ViT-Adapter
Video memory occupancy while training
Hi! I am interested in your work! I am training your UperNet + ViT-Adapter-Large on ADE20K, and I noticed that GPU memory usage gradually increases during training. Specifically, I train on 4x A100 with a batch size of 4 per GPU. At first it costs about 16 GB per GPU, but after roughly 60k iterations it grows to 44 GB per GPU. I would like to know whether this is normal. Looking forward to your reply!
Hi, you can check nvidia-smi to double-check. If it is consistent with, or close to, the memory cost shown in the training log, then it is normal.
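For comparing the two numbers, here is a minimal sketch, assuming a standard PyTorch training loop (the `log_gpu_memory` helper name and the call site are hypothetical, not part of the ViT-Adapter codebase). Note that nvidia-smi reports the full memory reserved by PyTorch's caching allocator plus the CUDA context, so it is usually higher than the "allocated" figure the log reports.

```python
import torch

def log_gpu_memory(tag=""):
    """Print the PyTorch allocator's view of GPU memory on the current device.

    nvidia-smi shows the reserved pool plus CUDA context overhead,
    so its number is normally >= the allocated value printed here.
    """
    if not torch.cuda.is_available():
        return
    dev = torch.cuda.current_device()
    allocated = torch.cuda.memory_allocated(dev) / 1024 ** 3   # tensors currently in use
    reserved = torch.cuda.memory_reserved(dev) / 1024 ** 3     # pool held by the caching allocator
    peak = torch.cuda.max_memory_allocated(dev) / 1024 ** 3    # peak allocation so far
    print(f"[{tag}] allocated={allocated:.1f} GB "
          f"reserved={reserved:.1f} GB peak={peak:.1f} GB")

# Example: call every N iterations inside the training loop, e.g.
# log_gpu_memory(tag=f"iter {iteration}")
```

If the reserved/peak values stay flat while nvidia-smi keeps climbing, the growth comes from the caching allocator or fragmentation rather than a leak in the model code.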