flash-attention
The CUDA version is 11.6. Why does this error still appear during installation?
(llm) (base) hong.h@hit-airline-utah-cn5-hong-h-master-0:~$ nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Tue_Mar__8_18:18:20_PST_2022
Cuda compilation tools, release 11.6, V11.6.124
Build cuda_11.6.r11.6/compiler.31057947_0

(llm) (base) hong.h@hit-airline-utah-cn5-hong-h-master-0:~$ python -c "import torch; print(torch.__version__)"
1.13.0

(llm) (base) hong.h@hit-airline-utah-cn5-hong-h-master-0:~$ pip show Ninja
Name: ninja
Version: 1.11.1.1
Summary: Ninja is a small build system with a focus on speed
Home-page: http://ninja-build.org/
Author: Jean-Christophe Fillion-Robin
Author-email: [email protected]
License: Apache 2.0
Location: /home/hong.h/miniconda3/envs/llm/lib/python3.9/site-packages
Requires:
Required-by: deepspeed

(llm) (base) hong.h@hit-airline-utah-cn5-hong-h-master-0:~$ pip install flash-attn --no-build-isolation
Looking in indexes: https://artifactory.nioint.com/artifactory/api/pypi/dd-pypi-all-virtual/simple, https://artifactory.nioint.com/artifactory/api/pypi/dd-pypi-all-virtual/simple
Collecting flash-attn
  Downloading https://artifactory.nioint.com/artifactory/api/pypi/dd-pypi-all-virtual/packages/packages/4e/7b/b42299ad0edf4d185c883dcf83d4f31bf577e8e86f20e70b8ea8ebc0d75d/flash_attn-2.5.4.tar.gz (2.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.5/2.5 MB 7.7 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [13 lines of output]
fatal: not a git repository (or any parent up to mount point /)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
Traceback (most recent call last):
File "
      torch.__version__ = 1.13.0
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
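The traceback above is truncated before the actual exception, so I can only guess at the cause, but flash-attn source builds are sensitive to a mismatch between the CUDA release that `nvcc` targets and the one the installed PyTorch wheel was built against. As a minimal self-check (the helpers `cuda_release` and `toolchains_match` are my own names for illustration, not part of flash-attn or PyTorch):

```python
import re

def cuda_release(version_string):
    """Extract (major, minor) from a CUDA version string, e.g. '11.6'
    or the 'release 11.6, V11.6.124' line printed by `nvcc -V`."""
    match = re.search(r"(\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError(f"no CUDA version found in {version_string!r}")
    return int(match.group(1)), int(match.group(2))

def toolchains_match(nvcc_output, torch_cuda):
    """True when nvcc and the PyTorch build target the same CUDA release."""
    return cuda_release(nvcc_output) == cuda_release(torch_cuda)

# Values taken from the transcript above: nvcc reports release 11.6.
print(toolchains_match("release 11.6, V11.6.124", "11.6"))
```

Feed the second argument the output of `python -c "import torch; print(torch.version.cuda)"`. If the two releases differ, installing a PyTorch wheel built for your local CUDA release (or a prebuilt flash-attn wheel, which skips the source build entirely) may sidestep this error.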
Same problem! Have you solved it?