twtynije
I installed the wheel with `pip install flash_attn-2.7.0.post2+cu124torch2.4.0cxx11abiFALSE-cp311-cp311-win_amd64.whl` (the `%2B` in the downloaded filename is just a URL-encoded `+`). I want to know how to call it correctly: should I use `from flash_attn import flash_attn_func` or `from torch.nn.functional import scaled_dot_product_attention`? Additionally, I only installed the...
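For what it's worth, here is a minimal sketch of the two call paths. Note they are not interchangeable drop-ins: `F.scaled_dot_product_attention` is built into PyTorch, takes `(batch, heads, seq_len, head_dim)` tensors, and works on CPU or GPU (dispatching to a FlashAttention kernel on supported CUDA GPUs), while `flash_attn_func` from the `flash-attn` package expects `(batch, seq_len, heads, head_dim)` fp16/bf16 CUDA tensors. The shapes below are arbitrary for illustration:

```python
import torch
import torch.nn.functional as F

# Layout for F.scaled_dot_product_attention: (batch, heads, seq_len, head_dim)
q = torch.randn(2, 8, 16, 64)
k = torch.randn(2, 8, 16, 64)
v = torch.randn(2, 8, 16, 64)

# Option 1: PyTorch built-in SDPA. On supported CUDA GPUs it can dispatch
# to a FlashAttention backend; on CPU it falls back to a math kernel.
out_sdpa = F.scaled_dot_product_attention(q, k, v)

# Option 2: the flash-attn package, if installed. It expects
# (batch, seq_len, heads, head_dim) layout and fp16/bf16 tensors on CUDA.
try:
    from flash_attn import flash_attn_func
    q2, k2, v2 = (t.transpose(1, 2).half().cuda() for t in (q, k, v))
    out_fa = flash_attn_func(q2, k2, v2)  # returns (batch, seq_len, heads, head_dim)
except (ImportError, RuntimeError):
    out_fa = None  # flash-attn not installed, or no CUDA GPU available

print(out_sdpa.shape)
```

If you only need attention to work (rather than the `flash-attn` API specifically), the built-in `F.scaled_dot_product_attention` is usually the simpler choice, since it needs no extra wheel.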
I'm using the same Flash Attention codebase (identical naming and implementation) as yours, but my copy runs on Windows, while yours presumably runs in a Linux environment. **This is my test.ipynb**...
Alright, I really appreciate your assistance. I'm working on resolving this now. May I reach out again if I have further questions later?