Dave Lage

Results: 250 comments by Dave Lage

https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main You could try the ones here. Don't use the scaled model for training. You only need the encoder from T5 XXL, and the Google version is both the encoder and...
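A minimal sketch of the encoder-only idea using Hugging Face `transformers` (assuming that's the loading stack in use). `T5EncoderModel` gives you just the encoder stack; the tiny random config below is purely illustrative so the snippet runs without downloading XXL weights — real training would use `from_pretrained` with the actual checkpoint.

```python
from transformers import T5Config, T5EncoderModel

# For real training you would load only the encoder weights from a checkpoint,
# e.g. T5EncoderModel.from_pretrained(<the T5 XXL repo you are using>).
# A tiny random config stands in here so the example runs offline.
config = T5Config(vocab_size=1000, d_model=64, d_kv=16, d_ff=128,
                  num_layers=2, num_heads=4)
encoder = T5EncoderModel(config)

# The encoder-only class carries no decoder stack at all.
print(hasattr(encoder, "decoder"))  # False
```

This is why the encoder-only files are much smaller than the full Google release: the decoder half simply isn't there.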

Once it is officially available we can probably mention it, but it is still a WIP, so I wouldn't want to direct people to it just yet.

Were you able to get it to train? From what I was hearing on their repo, it doesn't support autograd yet, so it can't be used for training. https://github.com/thu-ml/SageAttention/issues/60
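One quick way to sanity-check this yourself: run a tiny forward/backward pass through the attention function and see whether gradients come out. The `supports_backward` helper and tensor shapes below are made up for illustration; PyTorch's built-in SDPA is used as the baseline since it is known to support autograd.

```python
import torch
import torch.nn.functional as F

def supports_backward(attn_fn) -> bool:
    """Probe whether an attention implementation participates in autograd."""
    # Tiny (batch, heads, seq, head_dim) tensors; shapes are arbitrary.
    q = torch.randn(1, 2, 4, 8, requires_grad=True)
    k = torch.randn(1, 2, 4, 8)
    v = torch.randn(1, 2, 4, 8)
    try:
        out = attn_fn(q, k, v)
        out.sum().backward()
        return q.grad is not None
    except (RuntimeError, NotImplementedError):
        # Inference-only kernels typically fail here with no backward defined.
        return False

# Baseline: PyTorch's built-in attention does support training.
print(supports_backward(F.scaled_dot_product_attention))  # True
```

Swapping the candidate kernel in for `F.scaled_dot_product_attention` would show whether it can actually be trained with.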

The Flux reference implementation did not come with xformers, so it is not there. No specific reason, I believe. It could be added.

We can go through the different plugins for any invalid or archived ones. Archived ones might have people maintaining a fork, but they can add their fork afterwards? There...

For unmaintained ones, maybe we can use our discretion, or have two different maintainers agree. Review the issues and see if people are asking about maintenance, and if the...

This was originally put in when Neovim was a bit less popular. Most plugins then were Vim plugins that also worked on Neovim. Since then, Lua has taken over...

Thanks @SamantazFox @syeopite for reviewing this. Very good insight and clarity. I will work on a plan for executing a transition into these issues that will be transferable in...

Hard to say, as it looks a little unstable, which may indicate the hyperparameters aren't working well. 5e-5 might be too high for model fine-tuning, but you...
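"Looks a little unstable" can be made slightly less hand-wavy with a crude heuristic: look at the spread of recent loss values. The function name, window size, and threshold below are all made up for illustration — the point is just that a noisy tail suggests the learning rate may be too aggressive.

```python
from statistics import pstdev

def looks_unstable(losses, window=5, threshold=0.1):
    """Crude heuristic: flag a run whose recent losses vary more than threshold."""
    recent = losses[-window:]
    return pstdev(recent) > threshold

smooth = [1.0, 0.9, 0.85, 0.82, 0.80, 0.79]   # steadily decreasing
noisy = [1.0, 0.7, 1.1, 0.6, 1.2, 0.5]        # bouncing around

print(looks_unstable(smooth))  # False
print(looks_unstable(noisy))   # True
```

In practice you'd eyeball the loss curve rather than hard-code a threshold, but a bouncing tail like `noisy` is the kind of signal that suggests dropping the learning rate.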

My questions are about the requirements. Where was deepspeed coming from before? Is updating to 2.6.0 and having diffusers automatically update it a good idea, given the various ways you...
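The version-floor concern can be sketched without any packaging tooling: a simple numeric comparison of dotted release versions. `meets_minimum` is a hypothetical helper (it ignores pre-release tags), shown only to illustrate checking an installed version against the 2.6.0 floor mentioned above.

```python
def meets_minimum(installed: str, minimum: str) -> bool:
    """Numeric comparison of dotted release versions (no pre-release handling)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

print(meets_minimum("2.6.0", "2.6.0"))  # True
print(meets_minimum("2.5.1", "2.6.0"))  # False
```

For anything beyond this simple case (release candidates, post-releases), the `packaging.version` module is the usual tool.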