MiniGPT-4
How to fine-tune with 4×3090 or 8×3090?
I want to fine-tune (stage two) on 4 or 8 RTX 3090s, but there is not enough VRAM... Is there any way to perform fine-tuning under these constraints?
I guess you can try setting a smaller batch size in the training config file. The default batch size there is 64 (per GPU), which consumes about 70 GB of VRAM.
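For reference, here is a minimal sketch of the keys involved, assuming the LAVIS-style `run` section that MiniGPT-4's training configs use (e.g. `train_configs/minigpt4_stage2_finetune.yaml`; the exact file name and default values may differ in your checkout):

```yaml
# Hypothetical excerpt from the stage-two training config.
# Keys follow the LAVIS-style `run` section; values are illustrative.
run:
  batch_size_train: 12   # per-GPU batch size; lower this to fit 24 GB cards
  batch_size_eval: 12
  iters_per_epoch: 200   # optimizer steps per "epoch"
```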
@TsuTikgiau If I set a smaller batch size, should I increase `iters_per_epoch` to maintain performance? For example, batchsize=12, iters_per_epoch=200 => batchsize=2, iters_per_epoch=1200?
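For what it's worth, that scaling keeps the number of samples seen per GPU per epoch constant: 12 × 200 = 2,400 and 2 × 1,200 = 2,400. A sketch of the adjusted config, under the same assumed keys as above:

```yaml
# Scaled so that batch_size_train * iters_per_epoch stays at
# 2400 samples per GPU per epoch (2 * 1200 = 12 * 200).
run:
  batch_size_train: 2
  batch_size_eval: 2
  iters_per_epoch: 1200
```

One caveat (an assumption about training dynamics, not something stated in this thread): a smaller per-GPU batch also shrinks the effective batch size per optimizer step, so results may not match the defaults exactly unless you compensate, e.g. with gradient accumulation if your runner supports it, or by retuning the learning rate.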