dan_the_3rd

Results 191 comments of dan_the_3rd

Hi, We have some instructions in the readme to build from source https://github.com/facebookresearch/xformers#installing-xformers

What is the error you have? Have you tried setting `MAX_JOBS=2` for instance?

Hi, Have you tried setting `TORCH_CUDA_ARCH_LIST=8.7`? This will only build the kernels for your architecture, and it should be much faster. You can also add `MAX_JOBS=8` to...
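The two variables above can be combined when building from source. A minimal sketch (the `8.7` value matches the asker's GPU architecture, and the commented `pip` line is one way to build from source; adjust both for your setup):

```shell
# Only compile kernels for one GPU architecture (here SM 8.7),
# which greatly shortens the build, and cap parallel compile jobs
# so the build does not exhaust RAM.
export TORCH_CUDA_ARCH_LIST=8.7
export MAX_JOBS=8
# Then build from source, e.g.:
# pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers
```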

Hi, This is not supported, as we require torch > 1.12 at the moment (and most likely 2.0 at some point soon). However, you can try installing from source...

Hi, We don't support pytorch 1.9 with any recent release. We have very old versions of xFormers that support 1.9, but they won't have any of the functionality that you...

Hi, It's possible that you are using a newer version of PyTorch, which already includes xFormers' memory-efficient attention natively. In that case, xFormers is not necessary to get the best...
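For reference, a minimal sketch of the built-in path mentioned above, assuming PyTorch 2.0+ (shapes here are arbitrary examples):

```python
import torch
import torch.nn.functional as F

# Recent PyTorch ships fused/memory-efficient attention natively via
# scaled_dot_product_attention, which dispatches to the best available
# backend, so xFormers may not be required just for fast attention.
q = torch.randn(1, 8, 16, 64)  # (batch, heads, seq_len, head_dim)
k = torch.randn(1, 8, 16, 64)
v = torch.randn(1, 8, 16, 64)

out = F.scaled_dot_product_attention(q, k, v)
assert out.shape == q.shape
```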

From what you posted, it looks like xFormers is correctly set up. Maybe Automatic1111 is not using it properly, or not detecting it - I would open an issue on the...

Hi, This is breaking 3rd-party C++ libraries which include ``, like in xFormers: https://github.com/facebookresearch/xformers/blob/fad50d49834ab18dd137acc727bd4d567ff17842/xformers/csrc/boxing_unboxing.cpp#L8

Failure:

```
In file included from /ghrunner/shared/tmp/8878728050/lib/python3.11/site-packages/torch/include/torch/csrc/distributed/c10d/Backend.hpp:11,
                 from /ghrunner/shared/tmp/8878728050/lib/python3.11/site-packages/torch/include/torch/csrc/distributed/c10d/ProcessGroup.hpp:3,
                 from /ghrunner/_work/xformers/xformers/xformers/csrc/boxing_unboxing.cpp:8:
/ghrunner/shared/tmp/8878728050/lib/python3.11/site-packages/torch/include/torch/csrc/distributed/c10d/Utils.hpp:7:10: fatal error: fmt/format.h: No...
```

Hi, If you want deterministic (reproducible) results, you need to enable it in PyTorch: https://pytorch.org/docs/stable/generated/torch.use_deterministic_algorithms.html
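A minimal sketch of the opt-in described in the linked docs (the matmul here is just an illustrative op):

```python
import torch

# Opt in to deterministic kernels globally; ops without a deterministic
# implementation will raise an error instead of silently being
# non-reproducible.
torch.use_deterministic_algorithms(True)

torch.manual_seed(0)
x = torch.randn(4, 4)
y = x @ x  # matmul has a deterministic CPU implementation

torch.use_deterministic_algorithms(False)  # restore the default
```

Note that on CUDA some ops additionally require setting `CUBLAS_WORKSPACE_CONFIG`, as described in the same documentation page.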

> And CUTLASS said fused multi head attention examples is same as flash attention-2.

I believe those are not the same thing. Where did you see that? Flash-Attention 2 is...