
Dataloader problem

Open vivek1527 opened this issue 3 years ago • 4 comments

[Screenshot from 2021-10-14 22-40-19]

Hi,

I was trying to run the training from scratch and I am encountering this error. Do you know of any particular reason for this error? I am using a batch_size of 1 and a GPU with 4 GB of memory.

vivek1527 avatar Oct 14 '21 20:10 vivek1527

[image attachment]

Hi, can you please let me know the root cause of this error? I have tried many ways to resolve it, but it keeps reappearing.

VivekRamayanam19 avatar Oct 20 '21 17:10 VivekRamayanam19

Sorry, I don't recall this error. Can you check whether the dataset is complete and the path to your dataset is correct? Can you print the filename to see which sample caused the error?

qianqianwang68 avatar Oct 20 '21 17:10 qianqianwang68


I encountered the same situation. In megadepth.py, the `__getitem__` function normally returns a dict with 9 elements, but sometimes it returns None, which causes the error. The open-source code uses batch_size=6, and with this dataset it is practically impossible for all 6 pairs in a batch to return None at the same time, which is why the error never showed up there.
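A common workaround for this (not the repo's actual code, just a sketch with a toy dataset standing in for megadepth.py) is a custom collate_fn that drops None samples before collating, and a training loop that skips batches that end up empty:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torch.utils.data.dataloader import default_collate

class PairDataset(Dataset):
    """Toy stand-in for the MegaDepth dataset: some indices yield None."""
    def __init__(self, n=10, bad=(3, 7)):
        self.n, self.bad = n, set(bad)

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        if idx in self.bad:          # e.g. a corrupt or missing image pair
            return None
        return {"idx": torch.tensor(idx)}

def skip_none_collate(batch):
    # Drop None samples; return None if the whole batch was invalid.
    batch = [b for b in batch if b is not None]
    return default_collate(batch) if batch else None

loader = DataLoader(PairDataset(), batch_size=1, collate_fn=skip_none_collate)
for batch in loader:
    if batch is None:   # with batch_size=1, one bad sample kills the batch
        continue
    # ... normal training step on a valid batch ...
```

With batch_size=1 every bad sample produces an empty batch, which is why the crash is so easy to hit at that setting.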

aruba01 avatar Apr 11 '22 11:04 aruba01

Thank you so much @aruba01! That makes a lot of sense. I think an easy fix could be: whenever the sample is None, we reset it to a dummy sample, and in the training loop we simply skip dummy batches whenever we detect them. I think this will resolve the issue, and I will try to push an update soon.
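The dummy-batch idea above might look roughly like this; all field names (`is_dummy`) and the `load_pair` helper are hypothetical illustrations, not code from this repo:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class PairDataset(Dataset):
    """Toy version of the dummy-batch fix: never return None from __getitem__."""
    def __len__(self):
        return 6

    def __getitem__(self, idx):
        sample = self.load_pair(idx)
        if sample is None:                        # loading failed for this pair
            return {"is_dummy": torch.tensor(True),
                    "idx": torch.tensor(-1)}      # placeholder fields
        sample["is_dummy"] = torch.tensor(False)
        return sample

    def load_pair(self, idx):
        # Pretend every third pair fails to load.
        return None if idx % 3 == 2 else {"idx": torch.tensor(idx)}

loader = DataLoader(PairDataset(), batch_size=2)
for batch in loader:
    if batch["is_dummy"].any():   # skip any batch containing a dummy sample
        continue
    # ... forward/backward pass on real batches only ...
```

Because `__getitem__` always returns a dict with the same keys, the default collate function never sees None, and the skip happens safely in the training loop instead.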

qianqianwang68 avatar Apr 14 '22 20:04 qianqianwang68