Dataloader problem
Hi,
I was trying to run the training from scratch and I am encountering this error. Do you know any particular reason it might occur? I am using a batch_size of 1 and a 4GB GPU.
Hi, can you please let me know the root cause of this error? I have tried many ways to resolve it, but it keeps reappearing.
Sorry, I don't recall this error. Can you check that the dataset is complete and that the path to your dataset is correct? Can you print the filename to see which sample caused the error?
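For example, a thin wrapper like this could help narrow it down (a rough sketch; `DebugDataset` is not part of the repo, and mapping the index back to a filename depends on the internals of megadepth.py):

```python
from torch.utils.data import Dataset

class DebugDataset(Dataset):
    """Wraps an existing dataset and reports which sample index fails to load."""
    def __init__(self, base):
        self.base = base

    def __len__(self):
        return len(self.base)

    def __getitem__(self, idx):
        try:
            return self.base[idx]
        except Exception:
            print(f'loading failed for sample index {idx}')
            raise
```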
I encountered the same situation. In `megadepth.py`, the function `__getitem__` normally returns a dict with 9 elements, but sometimes it returns `None`, which causes the error. The open-source code uses `batch_size=6`, and with this dataset it practically never happens that all 6 pairs return `None` at the same time, which is why the authors never hit this error.
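A toy reproduction of that mechanism (everything here is made up for illustration, not the real megadepth.py; it assumes `None` samples are dropped before collation, so only an all-`None` batch fails):

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torch.utils.data.dataloader import default_collate

class ToyPairs(Dataset):
    """Toy stand-in: __getitem__ occasionally returns None instead of a dict."""
    def __len__(self):
        return 12

    def __getitem__(self, idx):
        if idx % 4 == 0:  # simulate a pair that fails to load
            return None
        return {'image0': torch.zeros(3), 'image1': torch.zeros(3)}

def filter_collate(batch):
    # Drop None samples before collating.
    batch = [s for s in batch if s is not None]
    return default_collate(batch)  # raises if every sample was None

# batch_size=6: each batch still has some valid samples, so training runs.
for batch in DataLoader(ToyPairs(), batch_size=6, collate_fn=filter_collate):
    pass

# batch_size=1: a single None sample empties the batch and collation fails.
for batch in DataLoader(ToyPairs(), batch_size=1, collate_fn=filter_collate):
    pass  # fails on index 0
```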
Thank you so much @aruba01! That makes a lot of sense. An easy fix could be: whenever the batch is None, replace it with a dummy batch, and in the training loop simply skip dummy batches whenever we detect them. I think this will resolve the issue, and I will try to push an update soon.
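For reference, one way that fix could look (a rough sketch of the idea, not the actual patch; `collate_or_dummy` and the `is_dummy` key are made up):

```python
import torch
from torch.utils.data import DataLoader
from torch.utils.data.dataloader import default_collate

# Hypothetical marker batch; a real one would mirror the 9-key dict
# returned by __getitem__ so downstream code sees consistent types.
DUMMY_BATCH = {'is_dummy': torch.tensor(True)}

def collate_or_dummy(batch):
    """Drop None samples; if nothing is left, return a flagged dummy
    batch instead of crashing the DataLoader."""
    batch = [s for s in batch if s is not None]
    if not batch:
        return DUMMY_BATCH
    return default_collate(batch)

# In the training loop, skip dummy batches whenever we detect them:
# for batch in DataLoader(dataset, batch_size=1, collate_fn=collate_or_dummy):
#     if 'is_dummy' in batch:
#         continue
#     ... normal training step ...
```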