
Can I fine-tune minigpt-v2 on 4090?

Open waltonfuture opened this issue 1 year ago • 3 comments

I set batch_size to 2 for training on 4x4090 GPUs, but it still fails with 'CUDA out of memory'.

waltonfuture · Dec 03 '23 05:12
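(Editor's note: as a quick sanity check before lowering batch_size further, it can help to confirm how much memory each card actually exposes and how much is free at the point of failure. A minimal PyTorch sketch, not part of the MiniGPT-4 repo, that simply prints per-GPU memory on whatever devices are visible:)

```python
import torch

# Print total and currently free memory for every visible CUDA device.
# Useful for confirming that each 4090 really exposes ~24 GB and for
# seeing how close training gets to the limit before the OOM is raised.
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    free_b, total_b = torch.cuda.mem_get_info(i)
    print(
        f"cuda:{i} {props.name}: "
        f"{free_b / 1024**3:.1f} GiB free / {total_b / 1024**3:.1f} GiB total"
    )
```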

I have the same problem. I am using Tesla V100 GPUs (32 GB memory each).

YushunXiang · Dec 06 '23 13:12

Hi @YushunXiang. I also tried to use Tesla V100 GPUs to fine-tune the model, but I ran into a problem: [error screenshot, not reproduced here]

mengfeidu · Dec 19 '23 15:12

I don't think that's possible. From the terminal output I can see it requires about 110 GB of VRAM to fine-tune MiniGPT-v2. I tried with 2 NVIDIA A6000s (96 GB in total) but failed. If you read the authors' paper, you will see they used 4 NVIDIA A100 GPUs (320 GB of VRAM in total). Clearly 4x4090 (4 x 24 GB = 96 GB) is not sufficient.

Crocodile-Chris · Jan 19 '24 00:01
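(Editor's note: a tiny back-of-the-envelope check of the figures quoted above. The ~110 GB requirement is taken from the terminal output mentioned in the comment, and the per-card sizes are nominal specs; note that summing memory across cards is only a rough proxy, since standard data-parallel training replicates the model on every GPU, so per-card capacity also matters.)

```python
# Rough aggregate-VRAM check for the setups discussed in this thread.
# required_gb (~110 GB) is the estimate quoted from the terminal output;
# the per-card sizes are nominal specs for each GPU model.
required_gb = 110

setups = {
    "4 x RTX 4090 (24 GB)": 4 * 24,   # 96 GB total
    "2 x A6000 (48 GB)":    2 * 48,   # 96 GB total
    "4 x A100 (80 GB)":     4 * 80,   # 320 GB total (setup used in the paper)
}

for name, total_gb in setups.items():
    verdict = "enough" if total_gb >= required_gb else "not enough"
    print(f"{name}: {total_gb} GB total -> {verdict} for ~{required_gb} GB")
```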