
How to finetune with 3090*4 or 3090*8?

Open ZhengQinLai opened this issue 1 year ago • 2 comments

I want to finetune (stage two) on 4 or 8 RTX 3090s, but there is not enough VRAM. Is there any way to perform finetuning under these constraints?

ZhengQinLai avatar Apr 21 '23 03:04 ZhengQinLai

I guess you can try setting a smaller batch size in the training config file. The default batch size there is 64 (per GPU), which consumes about 70 GB of VRAM.

TsuTikgiau avatar Apr 26 '23 14:04 TsuTikgiau
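For reference, the change would look something like the fragment below in the stage-2 training config (a sketch only; the exact file path and key names, e.g. `batch_size_train` and `accum_grad_iters`, follow the LAVIS-style configs MiniGPT-4 is built on and may differ in your checkout):

```yaml
# e.g. train_configs/minigpt4_stage2_finetune.yaml (path is an assumption)
run:
  batch_size_train: 4    # reduced from the default 64 to fit a 24 GB 3090
  accum_grad_iters: 16   # hypothetical: accumulate gradients to keep an
                         # effective per-GPU batch of 4 * 16 = 64
```

If the runner supports gradient accumulation, keeping `batch_size_train * accum_grad_iters` equal to the original batch size preserves the effective batch while cutting peak VRAM roughly in proportion to the per-step batch.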

@TsuTikgiau If I set a smaller batch size, should I increase iters_per_epoch to maintain performance? For example, batchsize=12, iters_per_epoch=200 => batchsize=2, iters_per_epoch=1200?

isruihu avatar Dec 26 '23 12:12 isruihu
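The scaling in that example keeps the number of samples seen per epoch constant (12 * 200 = 2 * 1200 = 2400). A small helper makes the rule explicit (the function name is illustrative, not from the repo):

```python
def scaled_iters_per_epoch(old_batch_size: int, old_iters: int, new_batch_size: int) -> int:
    """Return iters_per_epoch for a new batch size so that the total
    number of samples per epoch (batch_size * iters_per_epoch) is unchanged."""
    total_samples = old_batch_size * old_iters
    if total_samples % new_batch_size != 0:
        raise ValueError("new batch size must divide the samples per epoch")
    return total_samples // new_batch_size

# The example from the comment above:
print(scaled_iters_per_epoch(12, 200, 2))  # -> 1200
```

Note this only equalizes data seen per epoch; with a smaller per-step batch, the gradient noise and effective learning-rate behavior change, so results may still differ unless gradient accumulation is used to restore the effective batch size.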