                        Make xformers optional even if it is available
Problem: xformers being available doesn't always mean it should be used. For example, I install xformers because it makes inference fast, but I find it makes my DreamBooth training slower.
Solution: Require an explicit flag to enable xformers when running training scripts.
Related recent change: https://github.com/huggingface/diffusers/pull/1640
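A minimal sketch of the opt-in pattern being proposed, assuming an argparse-based training script. The flag name matches the one the diffusers training scripts ended up using; the model ID and the rest of the wiring are illustrative:

```python
import argparse

from diffusers import UNet2DConditionModel

parser = argparse.ArgumentParser()
parser.add_argument(
    "--enable_xformers_memory_efficient_attention",
    action="store_true",
    help="Use xformers attention. Off by default, even if xformers is installed.",
)
args = parser.parse_args()

unet = UNet2DConditionModel.from_pretrained(
    "CompVis/stable-diffusion-v1-4", subfolder="unet"
)

# xformers is only enabled when the user explicitly asks for it.
if args.enable_xformers_memory_efficient_attention:
    unet.enable_xformers_memory_efficient_attention()
```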
cc @patil-suraj @pcuenca, I tend to agree here with @kn. If we make it fully optional, we should however not wrap it in a `try ... except` statement, but just let it fail if xformers is not available.
Good idea on raising an exception when xformers is requested but not available! PTAL
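Continuing the sketch above, the fail-fast behavior could look like the following; `is_xformers_available` is the existing helper in `diffusers.utils.import_utils`, while `args` and `unet` are as defined in the earlier sketch:

```python
# Fail loudly instead of silently falling back: if the user asked for
# xformers but it cannot be imported, raise an error.
from diffusers.utils.import_utils import is_xformers_available

if args.enable_xformers_memory_efficient_attention:
    if not is_xformers_available():
        raise ValueError(
            "xformers is not available. Make sure it is installed correctly, "
            "or drop the flag to train without memory-efficient attention."
        )
    unet.enable_xformers_memory_efficient_attention()
```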
I'm in favor of this as it better follows PyTorch's "no optimizations by default" API philosophy and should be less surprising for users long-term.
This change might suddenly slow down some notebooks, so also kindly pinging @apolinario and @patil-suraj here.
Renamed the flag and added a note to the READMEs - PTAL
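For reference, a README-style invocation would then look roughly like this; the script name and the other arguments are placeholders, the point is the explicit xformers flag:

```bash
accelerate launch train_dreambooth.py \
  --pretrained_model_name_or_path="CompVis/stable-diffusion-v1-4" \
  --instance_data_dir="./instance_images" \
  --output_dir="./dreambooth-out" \
  --enable_xformers_memory_efficient_attention
```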
Thanks for the PR, @kn!