
Make xformers optional even if it is available

kn opened this issue 2 years ago · 4 comments

Problem: xformers being available doesn't always mean it should be used. For example, I install xformers because it makes inference fast, but I find it makes my DreamBooth training slower.

Solution: Require an explicit flag to enable xformers when running training scripts.

Related recent change: https://github.com/huggingface/diffusers/pull/1640
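A minimal sketch of the proposed opt-in behaviour, using `argparse` the way the training scripts do (the flag name and parser here are illustrative stand-ins, not the exact arguments of any diffusers script):

```python
import argparse


def parse_args(argv=None):
    # Illustrative parser: the real training scripts accept many more options.
    parser = argparse.ArgumentParser(description="toy training script")
    parser.add_argument(
        "--enable_xformers_memory_efficient_attention",
        action="store_true",  # stays off unless the user explicitly opts in
        help="Use xformers memory-efficient attention during training.",
    )
    return parser.parse_args(argv)


args = parse_args([])  # no flag passed
assert args.enable_xformers_memory_efficient_attention is False
```

With `action="store_true"` the default is `False`, so training never picks up xformers just because it happens to be installed.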

kn avatar Dec 19 '22 02:12 kn


cc @patil-suraj @pcuenca, I tend to agree with @kn here. If we make it fully optional, we should however not wrap it in a `try ... except` statement, but just let it fail if xformers isn't available.

patrickvonplaten avatar Dec 19 '22 09:12 patrickvonplaten

Good idea on raising an exception when xformers is requested but not available! PTAL
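That fail-loudly behaviour could look roughly like this (`enable_xformers_or_raise` and the `is_available` override are hypothetical helpers for illustration, not the diffusers API; the model is assumed to expose an `enable_xformers_memory_efficient_attention()` method):

```python
import importlib.util


def enable_xformers_or_raise(model, is_available=None):
    """Enable xformers attention, raising instead of silently skipping.

    `is_available` exists only so the check can be overridden in tests;
    by default we probe whether the xformers package is importable.
    """
    if is_available is None:
        is_available = importlib.util.find_spec("xformers") is not None
    if not is_available:
        raise ValueError(
            "xformers was explicitly requested but is not installed; "
            "install it (e.g. `pip install xformers`) or drop the flag."
        )
    model.enable_xformers_memory_efficient_attention()
```

Raising here instead of falling back means a user who passes the flag on a machine without xformers finds out immediately, rather than silently training without the optimization they asked for.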

kn avatar Dec 19 '22 13:12 kn

I'm in favor of this as it better follows the "no optimizations by default" API of PyTorch and should be "less surprising" for the user long-term.

This change might suddenly slow down some notebooks, so also kindly pinging @apolinario and @patil-suraj here.

patrickvonplaten avatar Dec 20 '22 00:12 patrickvonplaten

Renamed the flag and added a note in the READMEs - PTAL

kn avatar Dec 27 '22 18:12 kn

Thanks for the PR @kn !

patrickvonplaten avatar Jan 03 '23 12:01 patrickvonplaten