Is FlashAttention-3 incompatible with Windows compilation? #1551
```
(1st3128) E:\anaconda3\envs\1st3128\tmp\flash-attention\hopper>python setup.py install
Submodule 'csrc/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path '../csrc/cutlass'
Cloning into 'E:\anaconda3\envs\1st3128\tmp\flash-attention\csrc/cutlass'...
Submodule path '../csrc/cutlass': checked out '62750a2b75c802660e4894434dc55e839f322277'
Detected PyTorch version: 2.6.0+cu126
Traceback (most recent call last):
  File "E:\anaconda3\envs\1st3128\tmp\flash-attention\hopper\setup.py", line 405, in <module>
```
This is the same FlashAttention-3 build error reported in https://github.com/Dao-AILab/flash-attention/issues/1524.

Check flash-attention\hopper\setup.py, line 404: `if bare_metal_version != Version("12.8"):`. Change "12.8" to the CUDA version you actually have installed; a sketch of how to find that version is below.
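In case it helps, here is a minimal standalone check (not part of the flash-attention repo; it only assumes `nvcc` is on your PATH and that the `packaging` package is installed) that prints the toolkit version the `setup.py` comparison should be pinned to:

```python
# Minimal helper (not from the flash-attention repo): report the CUDA toolkit
# version that the check on hopper/setup.py line 404 should compare against.
import subprocess
from packaging.version import Version

# nvcc prints a line like: "Cuda compilation tools, release 12.6, V12.6.85"
raw = subprocess.check_output(["nvcc", "--version"], universal_newlines=True)
release = raw.split("release")[1].split(",")[0].strip()
bare_metal_version = Version(release)

print(f"Installed CUDA toolkit: {bare_metal_version}")
print(f'Replace Version("12.8") with Version("{bare_metal_version}") on line 404.')
```

For the log above, PyTorch was built against cu126, so the check would become `if bare_metal_version != Version("12.6"):`; after editing, re-run `python setup.py install`.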