VideoMoCo
OOM error when using more than 24 GPUs
It's a very interesting work, but I was wondering how many GPUs you used when training VideoMoCo. We found that if we use more than 24 GPUs (V100 32G), training is blocked by a GPU OOM error in the `_batch_shuffle` function.
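For context, here is a minimal sketch of how a MoCo-style `_batch_shuffle` typically works under DDP, based on the reference MoCo implementation that VideoMoCo builds on (the helper name `concat_all_gather` and the video tensor shape are assumptions, not confirmed from the VideoMoCo source). The key point is that every GPU materializes the *global* batch via `all_gather`, so memory use grows linearly with the number of GPUs, which would explain an OOM past some GPU count:

```python
import torch
import torch.distributed as dist


@torch.no_grad()
def concat_all_gather(tensor):
    # Gather the per-GPU batch from every rank and concatenate.
    # Each GPU ends up holding the FULL global batch, so memory
    # scales linearly with world size. For video clips of shape
    # (N, C, T, H, W) this gets large quickly.
    tensors_gather = [torch.ones_like(tensor) for _ in range(dist.get_world_size())]
    dist.all_gather(tensors_gather, tensor, async_op=False)
    return torch.cat(tensors_gather, dim=0)


@torch.no_grad()
def batch_shuffle_ddp(x):
    """Shuffle the global batch across GPUs (for shuffling BN).

    Returns the shuffled local batch and the indices needed to undo
    the shuffle after the key encoder's forward pass.
    """
    batch_size_this = x.shape[0]
    x_gather = concat_all_gather(x)          # (world_size * N, ...)
    batch_size_all = x_gather.shape[0]
    num_gpus = batch_size_all // batch_size_this

    # Rank 0 draws a random permutation; broadcast so all ranks agree.
    idx_shuffle = torch.randperm(batch_size_all).cuda()
    dist.broadcast(idx_shuffle, src=0)

    # Indices to restore the original order later.
    idx_unshuffle = torch.argsort(idx_shuffle)

    # Each GPU keeps its own slice of the shuffled global batch.
    gpu_idx = dist.get_rank()
    idx_this = idx_shuffle.view(num_gpus, -1)[gpu_idx]
    return x_gather[idx_this], idx_unshuffle
```

If this matches the VideoMoCo code, the OOM at >24 GPUs is expected behavior of this design rather than a bug: `x_gather` holds world_size copies of a per-GPU video batch on every device.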
Hi, we use 8 GPUs (V100).