dan_the_3rd

Results: 190 comments by dan_the_3rd

Hi,
We already have binaries pre-compiled for CUDA 12.1. You can install them with this command:
```bash
pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu121
```

Hi, `memory_efficient_attention` used to be faster than PyTorch's SDPA because xFormers was using Flash-Attention. Now SDPA also uses Flash-Attention, so it is expected that the two have the same speed. Also the...
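For reference, here is a minimal sketch comparing the two calls (assuming a CUDA device with xFormers and a recent PyTorch installed; the shapes are illustrative). Note the layout difference: `memory_efficient_attention` takes `[batch, seq, heads, head_dim]`, while SDPA takes `[batch, heads, seq, head_dim]`.

```python
import torch
import torch.nn.functional as F
import xformers.ops as xops

# Illustrative shapes: batch=2, seq_len=1024, heads=8, head_dim=64
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# xFormers expects [batch, seq, heads, head_dim]
out_xf = xops.memory_efficient_attention(q, k, v)

# SDPA expects [batch, heads, seq, head_dim], so transpose in and out
out_sdpa = F.scaled_dot_product_attention(
    q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
).transpose(1, 2)

# Both should dispatch to a Flash-Attention kernel and agree up to fp16 noise
torch.testing.assert_close(out_xf, out_sdpa, atol=2e-3, rtol=2e-3)
```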

> we need xformers 0.0.28.post1 wheel

Hi,
This is available for Windows, but only with CUDA 12.4. It's here: https://download.pytorch.org/whl/cu124/xformers/ You can install it with: ```bash pip3 install torch torchvision...

Hi, Is this still something you want to merge? As stated before, this approach will most likely be hard to implement on H100s, and probably it would be more...

So this PR does not implement anything new, but if you want to add support for other activation functions, we should probably discuss it. I believe this SwiGLU implementation has...
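For context, here is a plain PyTorch sketch of what a SwiGLU feed-forward block computes; the module name and layer names are illustrative, not xFormers' API, and the fused xFormers kernel is intended to produce the equivalent result in a single pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLU(nn.Module):
    """Reference SwiGLU feed-forward: out = (silu(x @ W1) * (x @ W2)) @ W3."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.w1 = nn.Linear(dim, hidden)   # gate projection
        self.w2 = nn.Linear(dim, hidden)   # value projection
        self.w3 = nn.Linear(hidden, dim)   # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w3(F.silu(self.w1(x)) * self.w2(x))
```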

Hi, Thanks for the report. We intended to keep the CUDA 12.4 version for Windows - but it looks like we don't have *any* version of xFormers uploaded to PyTorch for...

Hi, Thanks for the report, we will have a look. In the meantime, you can use this commit and everything should work: https://github.com/facebookresearch/xformers/commit/a40ca6e4a9aeb2093d7a03c5ae2a9f1215f3c296 cc @lvaleriu

It also looks like xFormers can't load its extensions, because you get the following message: ``` WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch 2.4.0.dev20240602+cu124 with CUDA...
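As a quick way to spot such a mismatch, `python -m xformers.info` prints the build and runtime versions; a minimal manual check (assuming both packages import) could look like this:

```python
import torch
import xformers

# The C++/CUDA extensions only load when the installed PyTorch/CUDA
# match what the xFormers wheel was built against.
print("torch   :", torch.__version__)   # e.g. 2.4.0.dev20240602+cu124
print("cuda    :", torch.version.cuda)  # CUDA version PyTorch was built with
print("xformers:", xformers.__version__)
```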