LLMZoo

How to fix this issue: "Failed to build flash-attn. ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects"

Open · LiZhangMing opened this issue 1 year ago · 1 comment

When I ran "pip install -r requirements.txt", I got the following error:

" File "/tmp/pip-build-env-4dcxq614/overlay/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1909, in _run_ninja_build raise RuntimeError(message) from e RuntimeError: Error compiling objects for extension [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip. ERROR: Failed building wheel for flash-attn Failed to build flash-attn ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects"

Environment:

    Name                        Version        Build              Channel
    _libgcc_mutex               0.1            main
    _openmp_mutex               5.1            1_gnu
    binutils_impl_linux-64      2.38           h2a08ee3_1
    bzip2                       1.0.8          h7b6447c_0
    ca-certificates             2023.5.7       hbcca054_0         conda-forge
    cuda-nvcc                   11.7.64        0                  nvidia/label/cuda-11.7.0
    gcc                         11.2.0         h702ea55_10        conda-forge
    gcc_impl_linux-64           11.2.0         h1234567_1
    gxx                         11.2.0         h702ea55_10        conda-forge
    gxx_impl_linux-64           11.2.0         h1234567_1
    kernel-headers_linux-64     2.6.32         he073ed8_15        conda-forge
    ld_impl_linux-64            2.38           h1181459_1
    libedit                     3.1.20191231   he28a2e2_2         conda-forge
    libffi                      3.3            h58526e2_2         conda-forge
    libgcc-devel_linux-64       11.2.0         h1234567_1
    libgcc-ng                   11.2.0         h1234567_1
    libgomp                     11.2.0         h1234567_1
    libstdcxx-devel_linux-64    11.2.0         h1234567_1
    libstdcxx-ng                11.2.0         h1234567_1
    libuuid                     1.41.5         h5eee18b_0
    ncurses                     6.4            h6a678d5_0
    openssl                     1.1.1o         h166bdaf_0         conda-forge
    pip                         23.1.2         pyhd8ed1ab_0       conda-forge
    python                      3.9.0          hdb3f193_2
    readline                    8.1            h46c0cb4_0         conda-forge
    setuptools                  67.7.2         pyhd8ed1ab_0       conda-forge
    sqlite                      3.41.2         h5eee18b_0
    sysroot_linux-64            2.12           he073ed8_15        conda-forge
    tk                          8.6.12         h1ccaba5_0
    tzdata                      2023c          h04d1e81_0
    wheel                       0.40.0         pyhd8ed1ab_0       conda-forge
    xz                          5.4.2          h5eee18b_0
    zlib                        1.2.13         h5eee18b_0

LiZhangMing · May 19 '23 09:05

Dear @LiZhangMing,

Thanks! For the issue with Flash-Attn, please refer to this repo.
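In case it helps while you look there: this build error usually comes from a mismatch between the CUDA toolkit seen by nvcc and the one your installed PyTorch was built against, or from missing build tools. The commands below are only a sketch of the usual remedies for flash-attn build failures, assuming the CUDA 11.7 toolchain shown in your environment; they are not steps from the LLMZoo docs:

    # Install PyTorch built for CUDA 11.7 first, since flash-attn compiles its
    # CUDA extension against the already-installed torch.
    pip install torch --index-url https://download.pytorch.org/whl/cu117
    # ninja and packaging are needed to compile the CUDA extension.
    pip install ninja packaging
    # Build flash-attn against your existing torch instead of an isolated build env.
    pip install flash-attn --no-build-isolation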

Alternatively, you can use train.py instead of train_fast.py, which turns off flash attention entirely; a placeholder invocation is sketched below.
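For example, a run without flash attention simply invokes the other script. The arguments here are placeholders for illustration, not the repo's actual flags:

    # Hypothetical invocation: train.py (no flash attention) instead of train_fast.py.
    # <base-model>, <data.json>, and <output-dir> stand in for your own values.
    python train.py --model_name_or_path <base-model> --data_path <data.json> --output_dir <output-dir>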

Best,
Zhihong

zhjohnchan · Jun 04 '23 11:06