Batch size 4 vs 8
Hi, I have a 3050 6 GB, so the largest batch size I can use is 4. I want to know how much time would be saved with a 3080 12 GB (which would let me use batch size 8). Is it worth renting one online?
(Right now, with ~15 min of vocals, each epoch takes around 1:05.)
In my experience, doubling the batch size didn't make training much faster. You can test this yourself by training your model on Colab and comparing the training times at different batch sizes.
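If you want to measure it directly, a simple approach is to time one epoch at each batch size on the same GPU. Here is a minimal PyTorch timing sketch; the model and the random dataset are placeholders I made up for illustration, so swap in your actual training loop to get meaningful numbers for your setup.

```python
import time
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model and synthetic data; replace these with your
# real training pipeline to benchmark batch size 4 vs 8.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 512)).to(device)
data = TensorDataset(torch.randn(4096, 512), torch.randn(4096, 512))
loss_fn = nn.MSELoss()

def time_one_epoch(batch_size: int) -> float:
    loader = DataLoader(data, batch_size=batch_size, shuffle=True)
    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
    start = time.perf_counter()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    if device == "cuda":
        # Wait for queued GPU work to finish before stopping the clock.
        torch.cuda.synchronize()
    return time.perf_counter() - start

for bs in (4, 8):
    print(f"batch size {bs}: {time_one_epoch(bs):.2f} s/epoch")
```

Note that with a small model or dataset the epoch time is often dominated by data loading rather than the GPU, in which case a bigger batch size barely helps, which matches what I saw.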
Hi, what about a newer GPU? Does that make a big difference? (e.g. a 4060 instead of that 3050)
Because ChatGPT says a model that takes 5 hours to train would take around 2 hours on a 4060, which I doubt.