ViT-Adapter
Inference in batches and multiple GPU
- I am trying to run inference with the model on multiple GPUs, but I get an unauthorized-access-to-GPU error. Do I need to configure the repo accordingly, and how do I use `CUDA_VISIBLE_DEVICES`?
- Can I run inference on batches of images?
I ran inference on multiple GPUs on vast.ai and didn't have problems. The only thing I did was make dist_test.sh executable: try either `chmod +x dist_test.sh` or `chmod 777 dist_test.sh`.
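A minimal sketch of a multi-GPU launch along those lines, assuming an mmseg-style `dist_test.sh` in the repo root (the config and checkpoint paths below are placeholders, not actual file names from the repo):

```shell
# Make the distributed-test launcher executable (once per checkout)
chmod +x dist_test.sh

# Restrict the run to GPUs 0 and 1, then launch with 2 processes
# (the third positional argument is the number of GPUs)
CUDA_VISIBLE_DEVICES=0,1 ./dist_test.sh \
    path/to/your_config.py \
    path/to/your_checkpoint.pth \
    2 --eval mIoU
```

`CUDA_VISIBLE_DEVICES` must be set before the process starts so that CUDA only ever sees the listed devices; each distributed worker then binds to one of them. Regarding batched inference: in mmseg-based repos the test-time batch size is typically 1 image per GPU, so launching across more GPUs (as above) is the usual way to parallelize evaluation rather than increasing a per-GPU batch size, but check the dataloader settings in your config to confirm.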