Suraj Patil
I will start testing this and will keep you posted.
Thanks a lot for the issue! - Same comment as Patrick's for the attention weights - Is the scheduler significantly different from the `DDIMScheduler`? If there is an example...
Okay, if I understand it correctly, the only difference is the `step`, which is the inversion of the actual `step`, so it's not really a new scheduler. Wonder if we...
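To make the "inverted `step`" idea concrete, here is a minimal scalar sketch of the deterministic DDIM update and its inversion. This is an illustration of the math only, not the `diffusers` API: the function names and the use of plain floats (instead of tensors and a noise-prediction model) are my own simplifications.

```python
import math

def ddim_step(x_t, eps, alpha_t, alpha_prev):
    # Deterministic DDIM update (eta=0): move from timestep t toward t-1.
    # First recover the predicted clean sample, then re-noise at alpha_prev.
    x0_pred = (x_t - math.sqrt(1 - alpha_t) * eps) / math.sqrt(alpha_t)
    return math.sqrt(alpha_prev) * x0_pred + math.sqrt(1 - alpha_prev) * eps

def ddim_inverse_step(x_t, eps, alpha_t, alpha_next):
    # Inverted step: identical update rule, but stepping from t toward t+1
    # (a noisier sample), which is why it's arguably not a new scheduler.
    x0_pred = (x_t - math.sqrt(1 - alpha_t) * eps) / math.sqrt(alpha_t)
    return math.sqrt(alpha_next) * x0_pred + math.sqrt(1 - alpha_next) * eps
```

With a fixed noise prediction `eps`, an inverse step followed by a forward step returns the original sample exactly, which shows the two are the same rule run in opposite directions.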
If I understand correctly, what you want is for the model to be able to generate images in the style of the training dataset after fine-tuning. I think you should...
I think you'll have to experiment with hyperparameters, like different learning rates, number of epochs, number of training images, etc. Also, it would be nice if you could post the...
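One way to organize that experimentation is a small sweep that builds one training command per hyperparameter combination. This is only a sketch: the script name `train_dreambooth.py` and the flag names mirror the diffusers training examples, but you should substitute your actual command and value ranges.

```python
from itertools import product

# Hypothetical values to sweep over -- adjust to your setup.
learning_rates = [1e-6, 2e-6, 5e-6]
max_steps = [400, 800]

def build_commands(script="train_dreambooth.py"):
    # Build one `accelerate launch` command per (lr, steps) combination.
    commands = []
    for lr, steps in product(learning_rates, max_steps):
        commands.append([
            "accelerate", "launch", script,
            f"--learning_rate={lr}",
            f"--max_train_steps={steps}",
        ])
    return commands
```

Each command list can then be run with `subprocess.run(cmd, check=True)`, keeping the outputs in separate directories so runs are easy to compare.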
Hard to say what the issue is, the command looks good to me. As I said above, I think you'll need to play a bit with different hyperparameters.
Same comment as Patrick's: it would be nice if you could post something so we could reproduce.
From the linked issue, it seems like they are using `bfloat16` for training; not sure if `bfloat16` works well with PyTorch stable diffusion. I'm using `xformers==0.0.16rc396` and it's working well.
I couldn't reproduce the issue. It's working fine for me. I tried the same command as you, with the same dependencies version (accelerate, xformers, triton) and with `torch==1.13.1`, `diffusers` main,...
No worries at all, maybe try in Colab or set up a fresh env; that would help rule out environment issues.