flash-attention
Windows wheel not found: https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.3.post1/flash_attn-2.4.3.post1+cu118torch1.11cxx11abiFALSE-cp38-cp38-win_amd64.whl
I am trying to install flash-attention on Windows 11, but it fails with this message:
> pip install flash-attn --no-build-isolation
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting flash-attn
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/92/95/b6c58e6747917f6f49c7555dd4a6eb39d31015f43c9300c45a3920006a6e/flash_attn-2.4.3.post1.tar.gz (2.5 MB)
Preparing metadata (setup.py) ... done
Requirement already satisfied: torch in c:\users\nomore\anaconda3\envs\modelscope\lib\site-packages (from flash-attn) (1.11.0+cu113)
Requirement already satisfied: einops in c:\users\nomore\anaconda3\envs\modelscope\lib\site-packages (from flash-attn) (0.7.0)
Requirement already satisfied: packaging in c:\users\nomore\anaconda3\envs\modelscope\lib\site-packages (from flash-attn) (23.2)
Collecting ninja (from flash-attn)
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/b6/2f/a3bc50fa63fc4fe9348e15b53dc8c87febfd4e0c660fcf250c4b19a3aa3b/ninja-1.11.1.1-py2.py3-none-win_amd64.whl (312 kB)
Requirement already satisfied: typing-extensions in c:\users\nomore\anaconda3\envs\modelscope\lib\site-packages (from torch->flash-attn) (4.9.0)
Building wheels for collected packages: flash-attn
Building wheel for flash-attn (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [9 lines of output]
fatal: not a git repository (or any of the parent directories): .git
torch.__version__ = 1.11.0+cu113
running bdist_wheel
Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.3.post1/flash_attn-2.4.3.post1+cu118torch1.11cxx11abiFALSE-cp38-cp38-win_amd64.whl
error: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
The URL it tries to fetch the win_amd64 wheel from does not exist.
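For context, the `Guessing wheel URL` line in the log comes from setup.py building a release-asset filename out of the local torch, CUDA, Python, and platform tags. The sketch below is illustrative, not the actual flash-attention code, but it reproduces the exact URL from the log, and that release simply has no matching Windows asset, so the download fails:

```python
# Illustrative sketch (NOT the real flash-attention setup.py) of how a
# prebuilt-wheel URL is guessed from the local environment's tags.
def guess_wheel_url(version, cuda, torch_ver, cxx11_abi, py_tag, platform):
    # Wheel filename encodes package version, CUDA/torch versions,
    # C++11 ABI flag, CPython tag, and platform.
    wheel = (f"flash_attn-{version}+cu{cuda}torch{torch_ver}"
             f"cxx11abi{cxx11_abi}-{py_tag}-{py_tag}-{platform}.whl")
    base = ("https://github.com/Dao-AILab/flash-attention/"
            f"releases/download/v{version}/")
    return base + wheel

url = guess_wheel_url("2.4.3.post1", "118", "1.11", "FALSE",
                      "cp38", "win_amd64")
print(url)  # same URL as in the error log; no such asset exists on the release
```

Since the guess is derived from local versions rather than the list of assets actually published, any combination that was never uploaded (here, a win_amd64 wheel) produces a dead URL.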
I also get ModuleNotFoundError: No module named 'wheel' when installing.
I am also getting the same error, ModuleNotFoundError: No module named 'wheel', while installing flash_attn-2.5.8 with pip install flash-attn on Windows 11.
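The 'wheel' error is a side effect of --no-build-isolation: with isolation disabled, pip does not install the build dependencies for you, so wheel (and friends) must already be in the environment. A sketch of a workaround, assuming FLASH_ATTENTION_FORCE_BUILD is still the env var flash-attention's setup.py reads to skip the wheel-URL guess (check your version's setup.py; a source build on Windows also needs the CUDA toolkit and MSVC):

```shell
# Preinstall the build dependencies that --no-build-isolation skips.
python -m pip install wheel setuptools ninja packaging

# Force a local source build so setup.py never tries to download a
# prebuilt wheel (env var assumed from flash-attention's setup.py;
# on cmd.exe use `set FLASH_ATTENTION_FORCE_BUILD=TRUE` instead).
FLASH_ATTENTION_FORCE_BUILD=TRUE python -m pip install flash-attn --no-build-isolation
```

If the source build is not an option, the remaining route is a community-built Windows wheel matching your exact torch/CUDA/Python combination.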