Neural-Texture-Extraction-Distribution
I have a question about the dataloader (lmdb)
Hello,
I am writing to ask a question about the dataloader batch size. I increased the batch size from the default value to fully utilize my GPU memory during training, but the average iteration time grew even after I also increased num_workers. For example, when training at a resolution of 256, a batch size of 4 gave a much lower average iteration time than a batch size of 16, and the slowdown was large enough that overall throughput (samples per second) dropped rather than improved.
I would like to know why using a larger batch size slows down training here instead of speeding it up.
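For reference, here is roughly the timing loop I used to compare the two settings. This is a simplified sketch: `LMDBDataset`, its import path, and its constructor arguments are stand-ins for the repo's actual dataset class.

```python
import time
import torch
from torch.utils.data import DataLoader

# Stand-in for the repo's LMDB-backed dataset; the real class opens an
# lmdb environment and decodes images at the requested resolution.
from data.dataset import LMDBDataset  # hypothetical import path


def measure(batch_size, num_workers, n_iters=50):
    dataset = LMDBDataset(path="data/train.lmdb", resolution=256)  # hypothetical args
    loader = DataLoader(
        dataset,
        batch_size=batch_size,
        num_workers=num_workers,
        shuffle=True,
        pin_memory=True,
        drop_last=True,
    )
    it = iter(loader)
    next(it)  # warm-up batch: exclude worker startup cost from the timing
    start = time.time()
    for _ in range(n_iters):
        batch = next(it)
    elapsed = time.time() - start
    per_iter = elapsed / n_iters
    per_sample = per_iter / batch_size
    print(f"bs={batch_size} workers={num_workers}: "
          f"{per_iter:.3f} s/iter, {per_sample * 1000:.1f} ms/sample")


measure(batch_size=4, num_workers=8)
measure(batch_size=16, num_workers=8)
```

Comparing the ms/sample numbers between the two runs is what I mean by the larger batch being slower overall.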
Thank you.