Luca Wehrstedt

Results: 190 comments by Luca Wehrstedt

Could you submit a PR with your proposal? I'm not sure if I'm a big fan of all these checks and branches in that snippet: they do deliver a very...

Any specific reason you need the `dev` label? Could you use a stable version instead?

I don't know off the top of my head, but PyTorch 1.12 is quite ancient, and I don't know whether any of our current packages are still compatible with it. I'd...

BTW I ran that command and it looks like xFormers 0.0.22 should work for you.

The build failure seems to come from the CUTLASS submodule, not from xFormers itself. Are you able to build CUTLASS as a standalone project? If not, you should report this in their...

Use `git submodule update --init --recursive` to make sure the submodule pin didn't deviate from the main branch. If that still doesn't work, then it could be attributed to the...
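For reference, a minimal sequence for resetting the submodules to their pinned commits might look like this (the clone path is an assumption; adjust it to where your xformers checkout lives):

```shell
# Assumed location of the xformers clone; bail out harmlessly if it's elsewhere.
cd ~/xformers 2>/dev/null || { echo "adjust the path to your xformers clone"; exit 0; }
# Re-read submodule URLs/paths from .gitmodules in case they changed upstream.
git submodule sync --recursive
# Check out the exact commits pinned by the current branch.
git submodule update --init --recursive
```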

Also, maybe double-check that you don't have an existing installation of CUTLASS on your system, corresponding to a different version, which might cause a mix of conflicting headers being included...
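A quick way to look for a stray system-wide CUTLASS install could be (typical include paths assumed; your distro may use others):

```shell
# Look for a system-wide CUTLASS whose headers could shadow the pinned submodule's.
find /usr/include /usr/local/include -maxdepth 2 -type d -name cutlass 2>/dev/null || true
# Also inspect extra include paths the compiler will search.
echo "CPATH=$CPATH"
echo "CPLUS_INCLUDE_PATH=$CPLUS_INCLUDE_PATH"
```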

You can try to just comment out the `-std=c++17` option, since [PyTorch should be adding it on its own if missing](https://github.com/pytorch/pytorch/blob/57625baceacd0be1b6d16a57328de1a156512200/torch/utils/cpp_extension.py#L559-L566). Same with `-O3`: try replacing it with `-O2`. If...
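As a sketch of that edit, the flag cleanup amounts to filtering the nvcc flag list in `setup.py`. The flag list below is an assumption for illustration, not xFormers' actual one:

```python
# Hypothetical nvcc flag list; the real one in setup.py will differ.
nvcc_flags = ["-O3", "-std=c++17", "--use_fast_math"]

# Drop -std=c++17 (PyTorch's cpp_extension adds it on its own if missing)...
nvcc_flags = [f for f in nvcc_flags if f != "-std=c++17"]
# ...and downgrade -O3 to -O2.
nvcc_flags = ["-O2" if f == "-O3" else f for f in nvcc_flags]

print(nvcc_flags)  # -> ['-O2', '--use_fast_math']
```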

We have some logic [here](https://github.com/facebookresearch/xformers/blob/f3a41aeca041217921ba836971ab3aa37923911d/setup.py#L488-L507) to avoid building FlashAttn2 if an existing compatible copy is provided by PyTorch. However, we build FlashAttn3 unconditionally. If you could debug that logic...
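The shape of that check is roughly "build our own FlashAttn2 only if PyTorch doesn't already ship a new-enough copy". A minimal sketch, where the function names and the version bound are assumptions for illustration (the real logic lives in the linked `setup.py`):

```python
from typing import Optional, Tuple

def parse_version(v: str) -> Tuple[int, ...]:
    # "2.5.6" -> (2, 5, 6), so tuples compare component-wise.
    return tuple(int(x) for x in v.split("."))

def should_build_flash_attn2(torch_flash_version: Optional[str],
                             minimum: str = "2.3.0") -> bool:
    # No bundled copy at all -> we must build our own.
    if torch_flash_version is None:
        return True
    # Bundled copy too old -> build our own; otherwise reuse PyTorch's.
    return parse_version(torch_flash_version) < parse_version(minimum)
```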

What exactly are you asking for help with? The error message seems quite clear: you cannot pass float32 tensors to that operator on AMD GPUs. If you're invoking xFormers through...
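The fix on the caller's side is to validate (or cast) the input dtype before invoking the operator. A hedged sketch of that guard, where the supported-dtype set and function names are assumptions for illustration, not xFormers API:

```python
# Assumed set of dtypes the AMD/ROCm kernel accepts; float32 is rejected.
SUPPORTED_AMD_DTYPES = {"float16", "bfloat16"}

def check_amd_dtype(dtype_name: str) -> None:
    """Raise early with an actionable message instead of a kernel error."""
    if dtype_name not in SUPPORTED_AMD_DTYPES:
        raise ValueError(
            f"{dtype_name} inputs are not supported on AMD GPUs; "
            "cast q/k/v to float16 or bfloat16 first"
        )
```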