
[Question] Error during this command: pip install flash-attn --no-build-isolation

Open Ahnhojin1223 opened this issue 1 year ago • 7 comments

Question

Command: pip install flash-attn --no-build-isolation

Log:

Collecting flash-attn
  Downloading flash_attn-2.5.6.tar.gz (2.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 11.4 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [12 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-9u5e9dng/flash-attn_e362cbbd46404df8a4978593d8bb899c/setup.py", line 114, in <module>
          raise RuntimeError(
      RuntimeError: FlashAttention is only supported on CUDA 11.6 and above.  Note: make sure nvcc has a supported version by running nvcc -V.
      
      
      torch.__version__  = 2.1.2+cu121
      
      
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

Environment: Ubuntu 20.04

My CUDA version is: Cuda compilation tools, release 10.1, V10.1.243
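For context, the error above comes from a version gate in flash-attn's setup.py: the CUDA toolkit reported by `nvcc -V` must be at least 11.6, and the toolkit here is 10.1. A minimal sketch of that check (the function name `cuda_ok` is made up for illustration; the real setup.py parses `nvcc` output instead of taking a string):

```python
def cuda_ok(version: str, minimum: str = "11.6") -> bool:
    """Return True if a CUDA toolkit version meets flash-attn's stated minimum."""
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    # Tuple comparison handles e.g. 11.10 > 11.6 correctly, unlike string compare.
    return to_tuple(version) >= to_tuple(minimum)

print(cuda_ok("10.1"))  # the reporter's toolkit: below 11.6, hence the RuntimeError
print(cuda_ok("12.1"))  # the CUDA version torch was built against (cu121)
```

Note that `torch.__version__ = 2.1.2+cu121` only means the PyTorch wheel bundles CUDA 12.1 runtime libraries; building flash-attn from source still needs a local `nvcc` of 11.6 or newer.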

How can I install flash-attn?

Ahnhojin1223 avatar Apr 04 '24 13:04 Ahnhojin1223

same question

iFe1er avatar Apr 10 '24 17:04 iFe1er

Same question! If you find any solution, please tell me! Thank you very much in advance!

FuZening avatar Apr 17 '24 08:04 FuZening

Try installing a version of flash-attention precompiled for your torch version. You can download the prebuilt wheels from https://github.com/Dao-AILab/flash-attention/releases/tag/v2.5.8 — that should work.
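To pick the right file from that release page, the wheel filename has to match your environment. A hedged sketch of decoding the fields, assuming the naming convention used on that release page (the filename below is an illustrative example, not a specific recommendation):

```python
import re

# Example wheel name following the flash-attention release naming convention.
name = "flash_attn-2.5.8+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"

m = re.match(
    r"flash_attn-(?P<version>[\d.]+)"   # flash-attn release version
    r"\+cu(?P<cuda>\d+)"                # CUDA toolkit it was built with (122 -> 12.2)
    r"torch(?P<torch>[\d.]+)"           # torch major.minor it targets
    r"cxx11abi(?P<abi>TRUE|FALSE)"      # C++11 ABI flag of that torch build
    r"-cp(?P<py>\d+)-",                 # CPython version (310 -> 3.10)
    name,
)
fields = m.groupdict()
print(fields)
```

The CUDA, torch, and Python fields all need to agree with the local environment (`python -c "import torch; print(torch.__version__, torch.version.cuda)"`) before running `pip install <wheel>`.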

JonathonYan1993 avatar Apr 29 '24 06:04 JonathonYan1993

pip install flash-attn==2.5.5 --no-build-isolation

zs-zhong avatar May 06 '24 15:05 zs-zhong