MiniGPT-4
How many data points exactly are used in stage-2 finetuning in MiniGPT-4 v1? I'm confused.
In `train_configs/minigpt4_llama2_stage2_finetune.yaml`:

```yaml
iters_per_epoch: 200
batch_size_train: 12
accum_grad_iters: 6
```
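(If I understand gradient accumulation correctly, the effective batch size per optimizer step with these defaults would be batch_size_train × accum_grad_iters = 12 × 6 = 72 — please correct me if that reading is wrong.)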
The code uses a MultiIterDataloader, which turns the data loader into an infinite iterator, so an "epoch" is bounded by iters_per_epoch rather than by the dataset size.
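To make sure I understand that behavior, here is a minimal sketch of what I mean by an infinite loop (my own simplified stand-in, not the repo's actual MultiIterDataloader implementation):

```python
from torch.utils.data import DataLoader

class InfiniteLoader:
    """Simplified sketch only -- the name InfiniteLoader is mine, not
    MiniGPT-4's. next() never raises StopIteration: a fresh pass over
    the dataset starts whenever the previous one is exhausted."""

    def __init__(self, loader: DataLoader):
        self._loader = loader
        self._iter = iter(loader)

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._iter)
        except StopIteration:
            # Dataset exhausted: restart and keep serving batches forever.
            self._iter = iter(self._loader)
            return next(self._iter)
```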
Because my 4090 can't fit the MiniGPT-4 stage-2 finetune at the default settings, I'm changing batch_size_train to 2.
Does that mean I'm only using 200 × 2 × 5 = 2000 data points (iters_per_epoch × batch size × max_epoch)? And how should I change accum_grad_iters accordingly?
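Here is the arithmetic behind my question as a sanity check (the 5 is max_epoch from the same yaml, and keeping the effective batch size constant is my assumption about the right way to adjust accum_grad_iters):

```python
iters_per_epoch = 200   # from the yaml
max_epoch = 5           # from the yaml
my_batch_size = 2       # what fits on my 4090

# Samples drawn per run: every inner iteration fetches one batch,
# regardless of gradient accumulation.
samples_seen = iters_per_epoch * my_batch_size * max_epoch  # 200 * 2 * 5 = 2000

# Default effective batch size per optimizer step:
default_effective = 12 * 6  # batch_size_train * accum_grad_iters = 72

# To keep the same effective batch size with batch size 2:
new_accum_grad_iters = default_effective // my_batch_size   # 72 // 2 = 36
```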
From `runner/runner_base.py`:

```python
# Outer loop: runs max_epoch times.
for cur_epoch in range(self.start_epoch, self.max_epoch):
    # training phase
    if not self.evaluate_only:
        logging.info("Start training")
```
From `tasks/base_task.py`:

```python
# Inner loop: always runs exactly iters_per_epoch iterations,
# pulling the next batch from the (infinite) loader each time.
for i in metric_logger.log_every(range(iters_per_epoch), log_freq, header):
    # if using iter-based runner, we stop after iters_per_epoch iterations.
    if i >= iters_per_epoch:
        break

    samples = next(data_loader)
```
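If I read these two loops correctly, a run draws exactly max_epoch × iters_per_epoch batches from the infinite loader, i.e. max_epoch × iters_per_epoch × batch_size_train samples in total (with repeats once the underlying dataset has been cycled through). Is that right?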
Same questions @junchen14