
Is FlashAttention-3 incompatible with Windows compilation? #1551

Open · digitalusman99 opened this issue 1 year ago · 1 comment

(1st3128) E:\anaconda3\envs\1st3128\tmp\flash-attention\hopper>python setup.py install

Submodule 'csrc/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path '../csrc/cutlass'
Cloning into 'E:\anaconda3\envs\1st3128\tmp\flash-attention\csrc/cutlass'...
Submodule path '../csrc/cutlass': checked out '62750a2b75c802660e4894434dc55e839f322277'

Detected PyTorch version: 2.6.0+cu126

Traceback (most recent call last):
  File "E:\anaconda3\envs\1st3128\tmp\flash-attention\hopper\setup.py", line 405, in <module>
    download_and_copy(
  File "E:\anaconda3\envs\1st3128\tmp\flash-attention\hopper\setup.py", line 355, in download_and_copy
    url = url_func(supported[system], arch, version)
KeyError: 'Windows'
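The traceback points at the likely cause: setup.py looks up the current OS (as reported by platform.system()) in a dict of supported platforms, and that dict has no 'Windows' key, so the lookup raises KeyError before any compilation starts. A minimal sketch of that failure pattern, with the dict contents assumed for illustration (only the names `supported` and `system` come from the traceback):

```python
import platform

# Assumed mapping: setup.py's actual dict may differ, but per the
# traceback it has no entry for "Windows".
supported = {"Linux": "linux-x86_64"}

system = platform.system()  # returns "Windows" on Windows

# This is the shape of the failing line: supported[system] raises
# KeyError: 'Windows' when the key is absent.
try:
    target = supported[system]
except KeyError:
    raise SystemExit(
        f"Platform {system!r} is not in the supported-platforms table; "
        "FlashAttention-3's setup does not handle it."
    )
```

In other words, this is not a compiler error: the Windows path is simply absent from the supported-platform table, which is consistent with the question in the title.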

digitalusman99 avatar Mar 25 '25 17:03 digitalusman99

flash attention 3 error : https://github.com/Dao-AILab/flash-attention/issues/1524

FurkanGozukara avatar Mar 26 '25 12:03 FurkanGozukara

Please check flash-attention\hopper\setup.py, line 404: "if bare_metal_version != Version("12.8"):". Change 12.8 to the CUDA version you have installed.

xlfkeer avatar Jul 19 '25 08:07 xlfkeer