SAM-Adapter-PyTorch
When I use the model I trained, CUDA runs out of memory, but with the pretrained model it's fine
Hi, I have some problems when I try to use the model I trained.

The first issue: I use the vit_h config and my GPU is an RTX A6000 (48 GB). Training with the vit_h pretrained weights works without any errors, but after training, when I run test.py with the checkpoint I saved, CUDA runs out of memory. I already tried batch_size=1 and the error still happens.

The second issue: when I try to train or test on 2 GPUs with the .pth I saved, the first GPU ends up hosting all the local_rank processes, which also makes CUDA run out of memory.
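To make the question concrete, here is a minimal sketch of the kind of loading code I mean for test.py (this is not the repo's exact code: `build_model`, the checkpoint path, and the input size are placeholders for my setup). It loads the .pth onto the CPU first and pins each process to its own GPU via local_rank, which is what I understand should prevent everything from landing on GPU 0.

```python
import os
import torch

def build_model():
    # Placeholder: in the real test.py the SAM-Adapter model is built
    # from the vit_h config here.
    return torch.nn.Identity()

# Pin this process to its own GPU so all ranks do not pile onto cuda:0.
local_rank = int(os.environ.get("LOCAL_RANK", "0"))
torch.cuda.set_device(local_rank)
device = torch.device(f"cuda:{local_rank}")

# Load the checkpoint onto the CPU first so GPU 0 does not absorb every copy.
state = torch.load("save/model_epoch_best.pth", map_location="cpu")  # illustrative path
model = build_model()
model.load_state_dict(state, strict=False)
model.to(device).eval()

with torch.no_grad():  # no gradient buffers during inference
    dummy = torch.zeros(1, 3, 1024, 1024, device=device)  # placeholder input size
    pred = model(dummy)
```

Is there anything in this pattern that would explain why the trained checkpoint runs out of memory at test time while the pretrained weights do not, or why the first GPU ends up holding every process?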