
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects

lvsh2012 opened this issue 4 months ago · 2 comments

Building wheels for collected packages: flash_attn
  Building wheel for flash_attn (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [19 lines of output]

  torch.__version__  = 2.2.1+cu121


  /mnt/conda/envs/qwen/lib/python3.10/site-packages/setuptools/__init__.py:80: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated.
  !!

          ********************************************************************************
          Requirements should be satisfied by a PEP 517 installer.
          If you are using pip, you can try `pip install --use-pep517`.
          ********************************************************************************

  !!
    dist.fetch_build_eggs(dist.setup_requires)
  running bdist_wheel
  Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  Raw wheel path /tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl
  error: [Errno 18] Invalid cross-device link: 'flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl' -> '/tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl'
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects
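
For context, Errno 18 ("Invalid cross-device link", EXDEV) looks like it comes from the step where setup.py downloads the prebuilt wheel and then tries to rename it into pip's temporary wheel directory: a plain os.rename() cannot move a file between two different filesystems (here, the working directory on the /mnt mount versus pip's temp dir under /tmp). A minimal sketch of that failure mode and the usual fallback, with hypothetical src/dst paths standing in for the ones in the log above, not flash-attn's actual code:

# Sketch only: os.rename() cannot move a file across filesystems and raises
# OSError with errno.EXDEV (Errno 18); shutil.move() falls back to
# copy-then-delete, so it works across devices. Paths are hypothetical.
import errno
import os
import shutil

src = "./flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
dst = "/tmp/pip-wheel-xxxx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl"

try:
    os.rename(src, dst)  # raises EXDEV when src and dst live on different filesystems
except OSError as exc:
    if exc.errno == errno.EXDEV:
        shutil.move(src, dst)  # copies the file, then removes the original
    else:
        raise

If the environment is otherwise fine, pointing pip's temporary directory at the same filesystem as the working directory (e.g. running pip with TMPDIR set to a directory under /mnt) should avoid the cross-device rename; building the wheel from source instead of fetching the prebuilt one would also sidestep this rename entirely.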

lvsh2012 · Mar 07 '24 02:03