
How to improve training efficiency and shorten training time

Open Tungsong opened this issue 1 year ago • 1 comments

I have two T4 GPUs on my machine, and I want to improve training efficiency, because there is plenty of spare GPU memory when I use the default params.

[WeCom screenshot: GPU memory usage]

I tried increasing batch_size to 256, but it doesn't seem to make a difference.

Tungsong avatar Apr 14 '23 03:04 Tungsong

Maybe you can try DeepSpeed.

ChrisXULC avatar Apr 20 '23 06:04 ChrisXULC
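For reference, a minimal DeepSpeed config along these lines could look like the sketch below (a hypothetical `ds_config.json`, not from this repo; ZeRO stage 2 shards optimizer state across the two T4s, and `"auto"` lets the launcher fill in the batch sizes):

```json
{
  "train_micro_batch_size_per_gpu": "auto",
  "gradient_accumulation_steps": "auto",
  "zero_optimization": {
    "stage": 2
  },
  "fp16": {
    "enabled": true
  }
}
```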

You should increase the micro_batch_size param @Tungsong

Tamminhdiep97 avatar Jul 10 '23 02:07 Tamminhdiep97
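To see why micro_batch_size is the knob that matters here, a rough sketch of how the fine-tuning script derives gradient accumulation from the two params (paraphrased from alpaca-lora's finetune.py; check the script itself for the exact code):

```python
# batch_size is the *effective* batch size; micro_batch_size is what a
# single forward/backward pass actually holds in GPU memory.
batch_size = 256
micro_batch_size = 4

# Raising batch_size alone only increases accumulation steps, so memory
# use (and per-step speed) stays the same. Raising micro_batch_size uses
# the spare memory and cuts the number of accumulation steps.
gradient_accumulation_steps = batch_size // micro_batch_size
print(gradient_accumulation_steps)  # 64 with the values above
```

So with memory to spare, try e.g. micro_batch_size=16 while keeping batch_size fixed: the effective batch (and thus training dynamics) is unchanged, but each optimizer step needs 4x fewer passes.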