MUNIT
Question about batch_size and training time
When the batch_size is 1, the training time of each iteration is 0.3s. However, when I changed the batch_size to 3, the training time of each iteration becomes 1.9s, which is about 6x longer than before. This is really different from other models I have trained. Could anyone tell me why this happened, or is this a normal case?
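A quick back-of-the-envelope check, using only the timings reported above, shows why this looks anomalous: throughput (images per second) should stay roughly flat as batch size grows, but here it roughly halves. The `throughput` helper below is purely illustrative, not part of MUNIT.

```python
# Illustrative throughput comparison based on the timings reported in this thread.
# (throughput is a hypothetical helper, not a MUNIT function.)

def throughput(batch_size, seconds_per_iter):
    """Images processed per second for a given batch size."""
    return batch_size / seconds_per_iter

t1 = throughput(1, 0.3)   # batch_size=1 at 0.3 s/iter -> ~3.33 images/s
t3 = throughput(3, 1.9)   # batch_size=3 at 1.9 s/iter -> ~1.58 images/s

# If the cost scaled linearly with batch size, batch_size=3 would take
# ~0.9 s/iter and keep throughput near 3.33 images/s. The reported
# 1.9 s/iter means per-image throughput roughly halved, which suggests
# something other than the extra images (e.g. the framework version)
# is slowing the iteration down.
print(t1, t3)
```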
It seems that it's because of the PyTorch version. When I train with PyTorch 1.2.0, the speed seems reasonable!
When the batch size is larger, is the speed always slower? Or can I increase the batch size and reduce the number of training iterations?
When you increase the batch size, you are training on more images in each iteration, so each iteration takes more time. It is true that you can increase the batch size while reducing the number of iterations. However, as this experiment shows, batch size = 1 delivers the best result.