
One RTX 3090, 88 hours, is this normal?

[Open] kli-casia opened this issue 1 year ago · 3 comments

[screenshot: training progress output showing an estimated ~88 hour runtime]

python finetune.py --base_model='decapoda-research/llama-7b-hf' --num_epochs=10 --cutoff_len=512 --group_by_length --output_dir='./lora-alpaca' --lora_target_modules='[q_proj,k_proj,v_proj,o_proj]' --lora_r=16 --micro_batch_size=8

kli-casia commented on May 6, 2023 at 23:05
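For context, a back-of-the-envelope step count makes the estimate easier to judge. A minimal sketch, assuming the stock Alpaca dataset of roughly 52k examples and finetune.py's default effective batch size of 128 (neither value is stated in the thread):

    # Rough step count for the command above. Assumed values (not from the
    # thread): ~52k Alpaca examples, finetune.py's default batch_size=128.
    dataset_size = 52_000        # approx. size of the Alpaca instruction set
    num_epochs = 10
    micro_batch_size = 8
    effective_batch_size = 128   # assumed default; gradient accumulation fills the gap

    grad_accum_steps = effective_batch_size // micro_batch_size          # 16
    micro_steps = dataset_size * num_epochs // micro_batch_size          # 65,000 forward/backward passes
    optimizer_steps = dataset_size * num_epochs // effective_batch_size  # ~4,062 weight updates

    total_hours = 88             # the estimate from the screenshot
    print(f"~{total_hours * 3600 / micro_steps:.1f} s per micro-batch")  # ~4.9 s

Under those assumptions, 88 hours works out to roughly 5 seconds per micro-batch of 8 sequences up to 512 tokens on a single 3090, which is in a plausible range rather than a sign of something broken.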

I get 118 hours on an A4000 with the following command, so I guess it's normal.

python finetune.py --base_model='decapoda-research/llama-7b-hf' --num_epochs=10 --cutoff_len=512 --group_by_length --output_dir='./lora-alpaca' --lora_r=16 --micro_batch_size=8

andreae293 commented on May 7, 2023 at 09:05

I think num_epochs=10 means it will train over the same dataset ten times.

cxfcxf commented on May 7, 2023 at 20:05
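Linear scaling with epoch count is a reasonable first-order model here, since each epoch is one full pass over the same data. A quick projection, assuming the 88-hour/10-epoch figure from the original post and that (if memory serves) the script's own default is 3 epochs:

    # Projected wall-clock time for other epoch counts, assuming runtime
    # scales linearly with num_epochs (each epoch repeats the same data).
    baseline_hours, baseline_epochs = 88, 10
    for epochs in (1, 3, 10):
        est = baseline_hours * epochs / baseline_epochs
        print(f"num_epochs={epochs:2d} -> ~{est:.0f} h")
    # num_epochs= 1 -> ~9 h
    # num_epochs= 3 -> ~26 h
    # num_epochs=10 -> ~88 h

So dropping from 10 epochs back to 3 should cut the estimate to roughly a quarter.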

I tried the code with two 3090 cards and the estimated completion time was ~35 hours.

CZWin32768 commented on May 12, 2023 at 09:05