
Why is the saved weight 700 KB instead of 3 MB?

Open · Allen-lz opened this issue 2 years ago

I used the following command for fine-tuning:

```bash
export MODEL_NAME="runwayml/stable-diffusion-v1-5"
export INSTANCE_DIR="./data/data_disney"
export OUTPUT_DIR="./exps/output_dsn"

lora_pti \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --instance_data_dir=$INSTANCE_DIR \
  --output_dir=$OUTPUT_DIR \
  --train_text_encoder \
  --resolution=512 \
  --train_batch_size=1 \
  --gradient_accumulation_steps=4 \
  --scale_lr \
  --learning_rate_unet=1e-4 \
  --learning_rate_text=1e-5 \
  --learning_rate_ti=5e-4 \
  --color_jitter \
  --lr_scheduler="linear" \
  --lr_warmup_steps=0 \
  --placeholder_tokens="|" \
  --use_template="style" \
  --save_steps=100 \
  --max_train_steps_ti=1000 \
  --max_train_steps_tuning=1000 \
  --perform_inversion=True \
  --clip_ti_decay \
  --weight_decay_ti=0.000 \
  --weight_decay_lora=0.001 \
  --continue_inversion \
  --continue_inversion_lr=1e-4 \
  --device="cuda:0" \
  --lora_rank=1 \
  --use_face_segmentation_condition
```

But the saved weight file is only about 700 KB.
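
For reference, the saved checkpoint can be inspected directly; the sketch below is a minimal example, and the file name in it is an assumption, so adjust it to whatever `lora_pti` actually wrote into `$OUTPUT_DIR`:

```python
# Minimal sketch for checking the on-disk size and parameter count of the
# saved LoRA weights. The path below is an assumed name, not guaranteed to
# match lora_pti's actual output file.
import os
from safetensors.torch import load_file

path = "./exps/output_dsn/final_lora.safetensors"  # assumed output file name
state = load_file(path)
n_params = sum(t.numel() for t in state.values())
print(f"{os.path.getsize(path) / 1024:.0f} KB on disk, {n_params:,} parameters")
```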

Allen-lz · Feb 14 '23 14:02

That's because your lora_rank is 1 here. The size of the saved LoRA weights scales roughly linearly with the rank, so increasing it will increase the file size; a rank of 4 gives roughly ~3 MB.
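
A rough back-of-the-envelope sketch (not from the original reply) of why the file size scales with the rank: each LoRA-injected linear layer adds a down matrix of shape (rank, in_features) and an up matrix of shape (out_features, rank), i.e. rank * (in_features + out_features) extra parameters, so the total grows linearly with `--lora_rank`. The layer dimensions below are illustrative, not the exact SD 1.5 shapes.

```python
# Extra parameters added by one LoRA-injected Linear layer:
# down matrix (rank x in_features) + up matrix (out_features x rank).
def lora_params(in_features: int, out_features: int, rank: int) -> int:
    return rank * (in_features + out_features)

# Illustrative layer size (a 320x320 projection); real UNet layers vary.
for rank in (1, 4):
    n = lora_params(320, 320, rank)
    print(f"rank={rank}: {n} extra params (~{n * 4 / 1024:.1f} KB in fp32)")
```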

rishabhjain · Feb 14 '23 16:02

thanks!

Allen-lz · Feb 20 '23 15:02