Neural-Texture-Extraction-Distribution

I have a question about dataloader(lmdb)

Open gkalstn000 opened this issue 2 years ago • 0 comments

Hello,

I am writing to ask a question about the dataloader batch size. I tried increasing the batch size from the default value to fully utilize my GPU memory during training. However, the average iteration time increased, even when I also increased num_workers. For example, when training at a resolution of 256, a batch size of 4 gave a much faster average iteration time than a batch size of 16.

I would like to know the reason why using a larger batch size seems to slow down the training process instead of speeding it up.

Thank you.
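For anyone hitting the same symptom: a quick way to tell whether the slowdown comes from data loading rather than the GPU is to time the DataLoader in isolation and compare per-sample (not per-iteration) cost across batch sizes. The sketch below is a minimal, hypothetical benchmark (it uses a synthetic in-memory dataset in place of the repo's lmdb dataset, so the class and numbers are illustrative only); if ms/sample rises along with the batch size, the loader, not the model, is the bottleneck.

```python
import time
import torch
from torch.utils.data import DataLoader, Dataset

class SyntheticImages(Dataset):
    """Stand-in for the lmdb-backed dataset: random 256x256 RGB tensors.

    Replace __getitem__ with real lmdb reads to benchmark the actual loader.
    """
    def __init__(self, length):
        self.length = length

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        return torch.randn(3, 256, 256)

def avg_iter_time(batch_size, num_workers=0, iters=8):
    """Average seconds per iteration for a given batch size."""
    loader = DataLoader(SyntheticImages(batch_size * iters),
                        batch_size=batch_size,
                        num_workers=num_workers)
    start = time.perf_counter()
    for _batch in loader:
        pass  # data-loading cost only; no forward/backward pass
    return (time.perf_counter() - start) / iters

if __name__ == "__main__":
    for bs in (4, 16):
        t = avg_iter_time(bs)
        print(f"batch_size={bs}: {t * 1000:.1f} ms/iter, "
              f"{t * 1000 / bs:.2f} ms/sample")
```

A larger batch is expected to cost more per iteration but less (or equal) per sample; if per-sample time grows too, the fix is usually on the loading side (more workers, `pin_memory=True`, `persistent_workers=True`, or faster lmdb reads) rather than the batch size itself.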

gkalstn000 · Jul 21 '23 02:07