
batchsize

Open No4x opened this issue 1 year ago • 1 comment

Hi, does the batch size matter? You used 8 GPUs for training, but I only have 2. Do I need to adjust the per-GPU batch size so that the effective batch sizes are equal?

No4x avatar Apr 21 '24 21:04 No4x

Hi, it's better to train the model with the same effective batch size. However, if you're limited by the number of GPUs or by available GPU memory, scaling the learning rate proportionally to the batch size can help achieve similar performance.

deepcs233 avatar May 05 '24 15:05 deepcs233
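
The proportional adjustment suggested above is the linear scaling rule: multiply the learning rate by the ratio of your effective batch size to the reference one. A minimal sketch, assuming hypothetical values (the base learning rate of 5e-4 and per-GPU batch size of 16 are illustrative, not taken from InterFuser's actual training config):

```python
# Linear scaling rule: lr_new = lr_base * (batch_new / batch_base).
# All numeric values here are assumptions for illustration only.

def scale_lr(base_lr: float, base_batch: int, new_batch: int) -> float:
    """Return a learning rate scaled proportionally to the effective batch size."""
    return base_lr * new_batch / base_batch

# Reference setup: 8 GPUs x 16 samples/GPU = effective batch 128.
# Reduced setup:   2 GPUs x 16 samples/GPU = effective batch 32.
base_lr = 5e-4                  # assumed reference learning rate
ref_batch = 8 * 16              # 128
new_batch = 2 * 16              # 32

print(scale_lr(base_lr, ref_batch, new_batch))  # 0.000125
```

With a quarter of the GPUs and the same per-GPU batch size, the learning rate drops to a quarter of its reference value; a short warmup period is often used alongside this rule to keep early training stable.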