flash-attention
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects
Building wheels for collected packages: flash_attn
Building wheel for flash_attn (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [19 lines of output]
torch.__version__ = 2.2.1+cu121
/mnt/conda/envs/qwen/lib/python3.10/site-packages/setuptools/__init__.py:80: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated.
!!
********************************************************************************
Requirements should be satisfied by a PEP 517 installer.
If you are using pip, you can try `pip install --use-pep517`.
********************************************************************************
!!
dist.fetch_build_eggs(dist.setup_requires)
running bdist_wheel
Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
Raw wheel path /tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl
error: [Errno 18] Invalid cross-device link: 'flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl' -> '/tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects
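For context: [Errno 18] is EXDEV. os.rename() cannot move a file across filesystems, and the log shows the wheel downloaded into the build directory being renamed into pip's staging directory under /tmp, which is often a separate mount (e.g. tmpfs). A minimal sketch of the behavior, with illustrative file names taken from the log above:

```python
import os
import shutil
import tempfile

# Dummy "wheel" in the current working directory (stands in for the
# prebuilt wheel that flash-attention's setup.py downloads).
src = "flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl"
with open(src, "wb") as f:
    f.write(b"dummy wheel contents")

# Destination directory, like pip's /tmp/pip-wheel-... dir, which may
# live on a different filesystem than the build directory.
dst = os.path.join(tempfile.mkdtemp(), src)

try:
    # os.rename() raises OSError [Errno 18] (EXDEV) when src and dst
    # are on different filesystems -- the failure shown in the log.
    os.rename(src, dst)
except OSError:
    # shutil.move() falls back to copy-then-delete, so it works across
    # filesystems; this is the workaround mentioned further down.
    shutil.move(src, dst)

print("wheel moved to", dst)
```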
Same, but for a custom build to run on AWS Linux.
torch.__version__ = 2.3.0a0+git26431db
Everything else works; I just can't get exllamav2 to use flash_attn, even when installing it from pip (a non-source install). I was hoping a build from source would fix the issue.
Same error; a temporary fix is to use shutil.move(wheel_filename, wheel_path) instead of os.rename(src, dst) in setup.py, as mentioned by @CliuGeek9229 in https://github.com/Dao-AILab/flash-attention/issues/598#issuecomment-1784996156
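A sketch of what that change amounts to; the helper name below is hypothetical, and the surrounding code in setup.py may differ between releases. Only the swap of os.rename for shutil.move comes from the linked comment:

```python
import shutil

def move_wheel(wheel_filename: str, wheel_path: str) -> None:
    # Hypothetical wrapper for the spot in flash-attention's setup.py
    # that previously called: os.rename(wheel_filename, wheel_path)
    # shutil.move() copies and deletes when a plain rename would fail
    # with EXDEV, so it also works when /tmp is on another filesystem.
    shutil.move(wheel_filename, wheel_path)
```

Pointing TMPDIR at a directory on the same filesystem as the build tree should also sidestep the cross-device rename, without patching setup.py.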