
What is micro_batch_size?

Open weiddeng opened this issue 1 year ago • 2 comments

A stupid question: I think I know what batch_size is, but what is micro_batch_size and what is it for? Thanks!

As in

python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --output_dir './lora-alpaca' \
    --batch_size 128 \
    --micro_batch_size 4

weiddeng avatar Apr 26 '23 21:04 weiddeng

If you are running on Colab you should set it to 4; on higher-memory GPUs I think you can go up to 8. I tried this on Colab Pro and used 4, otherwise I was getting OOM errors.

Risingabhi avatar Apr 27 '23 06:04 Risingabhi

I think it depends on your GPU. I tried 16 and it worked.

StarNJey avatar May 03 '23 08:05 StarNJey

gradient_accumulation_steps = batch_size // micro_batch_size
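
In case it helps: as I understand it, micro_batch_size is the number of examples that actually go through the GPU in a single forward/backward pass, while batch_size is the effective batch size per optimizer update; the trainer accumulates gradients over batch_size // micro_batch_size micro batches before stepping. A minimal sketch of the idea (not the actual finetune.py code; model, optimizer, and data_loader here are placeholders):

    batch_size = 128                 # effective examples per optimizer update
    micro_batch_size = 4             # examples on the GPU per forward/backward pass
    gradient_accumulation_steps = batch_size // micro_batch_size  # 32

    optimizer.zero_grad()
    for step, micro_batch in enumerate(data_loader):      # each micro_batch holds 4 examples
        loss = model(**micro_batch).loss
        (loss / gradient_accumulation_steps).backward()   # accumulate scaled gradients
        if (step + 1) % gradient_accumulation_steps == 0:
            optimizer.step()                               # one update per 128 examples
            optimizer.zero_grad()

So a smaller micro_batch_size trades speed for GPU memory without changing the effective batch size.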

weiddeng avatar May 21 '23 22:05 weiddeng