Simo Ryu
Right, gradient accumulation doesn't work right now because it implicitly updates all the other params wrapped inside it. So I removed it.
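For context, a minimal sketch of the intended behavior: accumulate gradients over several micro-batches while updating only the trainable (LoRA-style) params, leaving the frozen base params untouched. All names here are hypothetical stand-ins, not the repo's actual modules.

```python
import torch

torch.manual_seed(0)

lora = torch.nn.Linear(4, 4)      # stand-in for the trainable LoRA layers
frozen = torch.nn.Linear(4, 4)    # stand-in for the frozen base model
for p in frozen.parameters():
    p.requires_grad_(False)       # frozen params must never be updated

# Only the trainable params go into the optimizer.
opt = torch.optim.SGD(lora.parameters(), lr=0.1)
accum_steps = 4

lora_before = lora.weight.clone()
frozen_before = frozen.weight.clone()

for step, x in enumerate(torch.randn(8, 4)):
    # Scale the loss so the accumulated gradient matches a full batch.
    loss = (lora(frozen(x)) ** 2).mean() / accum_steps
    loss.backward()               # grads accumulate on lora params only
    if (step + 1) % accum_steps == 0:
        opt.step()                # one real update per accum_steps micro-batches
        opt.zero_grad()
```

The key point is that the frozen params are excluded from both autograd and the optimizer, so accumulation cannot leak updates into them.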
Yes, I think so. But I'm not really used to the accelerate package, so that probably wasn't the right way to fix it. I'll try to make it work with grad...
Quick question: would this be compatible with, for example, CLIs and so on?
Can you change this to the dev branch? Thanks
Sure, that's no problem. I also think that we can update them one at a time, iteratively. This will get the best of both worlds: training both text, unet +...
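One way to sketch the "one at a time, iteratively" idea: keep separate optimizers for the two modules and alternate which one steps each iteration. The module names below are hypothetical stand-ins for the text encoder and unet, not the actual training script.

```python
import torch

torch.manual_seed(0)

text_encoder = torch.nn.Linear(4, 4)   # stand-in for the text encoder
unet = torch.nn.Linear(4, 4)           # stand-in for the unet

opt_text = torch.optim.SGD(text_encoder.parameters(), lr=0.1)
opt_unet = torch.optim.SGD(unet.parameters(), lr=0.1)

text_before = text_encoder.weight.clone()
unet_before = unet.weight.clone()

for step, x in enumerate(torch.randn(6, 4)):
    loss = (unet(text_encoder(x)) ** 2).mean()
    loss.backward()
    # Alternate updates: even steps train the text encoder, odd steps the unet.
    opt = opt_text if step % 2 == 0 else opt_unet
    opt.step()
    # Clear grads on both so the skipped module doesn't accumulate stale grads.
    opt_text.zero_grad()
    opt_unet.zero_grad()
```

Over many steps both modules get trained, but each update touches only one of them, which avoids the two learning rates interfering within a single step.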
I think this is crucial. Hold on I am making them.
Oh wow this is great @Thomas-MMJ ! Thank you so much!
So these can be used freely, right? I will add them into the repo.
Nice!! ok thank you @Thomas-MMJ !!
This is great. I am currently refactoring the training script; I will use this trick in it.