flash-attention
linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv
ENV
- torch 2.1.2
- flash-attn 2.5.8
- cuda 11.7
ERROR
flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv
I got the same error when I tried to use 2.5.8, and I solved it by temporarily rolling back to version 2.5.7 of flash_attn.
pip install flash_attn==2.5.7
Hope that helps you.
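For context on why the rollback helps: the missing symbol _ZN3c104impl3cow11cow_deleterEPv demangles to c10::impl::cow::cow_deleter(void*), which only exists in newer libtorch builds, so a flash-attn wheel compiled against a newer torch than the installed 2.1.2 fails to load. A rough way to confirm the mismatch (a sketch only; the paths assume a standard pip layout where libtorch_cpu.so lives under torch/lib):

# demangle the symbol named in the error
c++filt _ZN3c104impl3cow11cow_deleterEPv
# check the torch version actually installed
python -c "import torch; print(torch.__version__)"
# see whether your libtorch exports the symbol the extension needs
nm -D "$(python -c 'import torch, os; print(os.path.join(os.path.dirname(torch.__file__), "lib", "libtorch_cpu.so"))')" | grep cow_deleter

If the grep prints nothing, the prebuilt flash-attn wheel expects a newer torch; either roll flash_attn back as above or upgrade torch.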
I used pip install flash_attn==2.5.7, and then got:
Looking in indexes: https://mirrors.aliyun.com/pypi/simple/
Collecting flash_attn==2.5.7
Using cached https://mirrors.aliyun.com/pypi/packages/21/cb/33a1f833ac4742c8adba063715bf769831f96d99dbbbb4be1b197b637872/flash_attn-2.5.7.tar.gz (2.5 MB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
Traceback (most recent call last):
File "/data/ComfyUI/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
Check 4. I solved this issue by adding the --no-build-isolation flag!
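To expand on that: flash-attn builds from source via a setup.py that imports torch, and with pip's default build isolation the temporary build environment has no torch installed, which is (as far as I can tell) what makes the "Getting requirements to build wheel" step fail. With --no-build-isolation, pip builds against the packages already in your environment, so torch has to be installed first. A rough sequence (packaging and ninja are the usual build prerequisites; adjust to your setup):

# torch must already be importable in the environment used for the build
pip install torch
# packaging is required by the build; ninja is optional but speeds up compilation
pip install packaging ninja
# build flash-attn against the torch already installed
pip install flash-attn --no-build-isolation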
Thanks, I used the command pip install flash-attn --no-build-isolation and it solved the problem, too.
Thanks. Python 3.9.19, CUDA 12.2, torch 2.3.0, flash_attn==2.5.8: it works.