MiniGPT-4
Can I fine-tune minigpt-v2 on 4090?
I set the batch_size to 2 for training on 4× 4090 GPUs, but it still shows 'CUDA out of memory'.
I have the same problem. I'm using Tesla V100 GPUs (32 GB memory each).
Hi @YushunXiang. I also tried to use Tesla V100 GPUs to fine-tune the model, but I ran into a problem:
I don't think that's possible. From the terminal output I can see it requires 110 GB of VRAM to fine-tune MiniGPT-v2. I tried with 2 NVIDIA A6000s (96 GB in total) but failed. If you read the authors' paper, you'll see they used 4 NVIDIA A100 GPUs (320 GB of VRAM in total). Clearly 4× 4090 (24 GB × 4 = 96 GB) can't be sufficient.
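As a quick sanity check, here is the arithmetic for the setups mentioned in this thread. The ~110 GB requirement is the figure reported from the terminal output above; the per-card capacities are the advertised specs, and the actual requirement will vary with batch size and precision:

```python
# Rough total-VRAM check for the GPU setups discussed in this thread.
# 110 GB is the requirement reported from the terminal output above;
# real usage depends on batch size, precision, and optimizer state.
required_gb = 110

setups = {
    "4x RTX 4090 (24 GB each)":    4 * 24,
    "2x NVIDIA A6000 (48 GB each)": 2 * 48,
    "4x NVIDIA A100 (80 GB each)":  4 * 80,  # the authors' setup per the paper
}

for name, total_gb in setups.items():
    verdict = "enough" if total_gb >= required_gb else "insufficient"
    print(f"{name}: {total_gb} GB total -> {verdict}")
```

Note that even when the total across cards exceeds the requirement, each card still needs to hold its own shard plus activations, so a setup can fail despite a sufficient-looking total unless memory is actually sharded (e.g. with DeepSpeed ZeRO or FSDP).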