InterFuser
batchsize
Hi, does batch size matter? You used 8 GPUs for training, but I only have 2 GPUs. Do I need to adjust the batch size so that the effective batch sizes are equal?
Hi, it's best to train the model with the same effective batch size. However, if you're limited by the number of GPUs or by available GPU memory, scaling the learning rate proportionally to the batch size (the linear scaling rule) can help achieve similar performance.
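As a rough illustration of that proportional adjustment, here is a minimal sketch of the linear scaling rule. The reference values (8 GPUs, per-GPU batch of 8, base learning rate `5e-4`) are hypothetical placeholders, not InterFuser's actual training configuration; substitute the values from the repo's training script.

```python
def scaled_lr(base_lr: float, base_batch: int, actual_batch: int) -> float:
    """Scale the learning rate linearly with the effective batch size."""
    return base_lr * actual_batch / base_batch

# Assumed reference setup: 8 GPUs x batch 8 per GPU = effective batch 64
base_effective_batch = 8 * 8
base_lr = 5e-4  # assumed base learning rate, not the repo's real value

# Your setup: 2 GPUs x batch 8 per GPU = effective batch 16
my_effective_batch = 2 * 8
my_lr = scaled_lr(base_lr, base_effective_batch, my_effective_batch)
print(my_lr)  # 4x smaller effective batch -> 4x smaller learning rate
```

With a 4x smaller effective batch, the rule gives a 4x smaller learning rate; some recipes also add a short warmup when applying it.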