The results are not good when I change the data
The results I reproduced with the data in the paper are good.
My input images:
My result:
But when I switch to other pictures, the result is worse (the generated sunglasses do not look like my input sunglasses).
My input images:
My result:
My training parameters are as follows (I use the pretrained model runwayml/stable-diffusion-v1-5):
!accelerate launch src/diffuser_training.py \
  --pretrained_model_name_or_path=$MODEL_NAME \
  --output_dir=$OUTPUT_DIR \
  --concepts_list="/content/concepts_list.json" \
  --with_prior_preservation --prior_loss_weight=1.0 \
  --resolution=512 \
  --train_batch_size=2 \
  --learning_rate=1e-5 \
  --lr_warmup_steps=0 \
  --max_train_steps=500 \
  --num_class_images=200 \
  --scale_lr --hflip \
  --modifier_token "<whxm>+<bkmj>"
My concepts_list.json is as follows:
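It has one entry per modifier token, in the same shape as the multi-concept example in the repo; the prompts and directory paths below are only placeholders, not my actual ones:

    [
        {
            "instance_prompt": "photo of a <whxm> sunglasses",
            "class_prompt": "sunglasses",
            "instance_data_dir": "/content/data/sunglasses",
            "class_data_dir": "/content/real_reg/samples_sunglasses/"
        },
        {
            "instance_prompt": "photo of a <bkmj> object",
            "class_prompt": "object",
            "instance_data_dir": "/content/data/object",
            "class_data_dir": "/content/real_reg/samples_object/"
        }
    ]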
Is there something wrong with my training process that is causing this poor result? Thanks!
Hi, can you check whether individual sunglass generations from the fine-tuned model are similar to the target images, and post them? If they are not, training longer with a lower learning rate may help.
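For example, a minimal single-concept check could look like the sketch below. It assumes the fine-tuned weights were exported as a full diffusers pipeline directory (if training only saved a delta checkpoint, it has to be loaded with the repo's own loading code first), and the path and prompt are placeholders:

    import torch
    from diffusers import StableDiffusionPipeline

    # Load the fine-tuned model (placeholder path: your $OUTPUT_DIR).
    pipe = StableDiffusionPipeline.from_pretrained(
        "/content/output_dir",
        torch_dtype=torch.float16,
    ).to("cuda")

    # Generate a few images of the single concept with its modifier token
    # and compare them against the training images.
    for i in range(4):
        image = pipe(
            "photo of a <whxm> sunglasses",
            num_inference_steps=50,
            guidance_scale=7.5,
        ).images[0]
        image.save(f"sunglasses_check_{i}.png")

If these single-concept generations already fail to capture the sunglasses, retraining with a smaller --learning_rate (for example 5e-6 instead of 1e-5) and more than 500 --max_train_steps is the first thing I would try.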
BTW, where can I download the training data used in the paper?