
linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv

Open · zui-jiang opened this issue 9 months ago • 5 comments

ENV

  • torch 2.1.2
  • flash-attn 2.5.8
  • cuda 11.7

ERROR: flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv

zui-jiang · May 14 '24
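
The missing symbol demangles to c10::impl::cow::cow_deleter(void*), which most likely means the prebuilt flash-attn 2.5.8 wheel was compiled against a newer torch than the torch 2.1.2 installed here. A minimal diagnostic sketch, assuming a Linux torch wheel and binutils' nm on PATH (the libc10.so path lookup below is illustrative, not part of flash-attn):

    import os
    import subprocess

    import torch

    # Symbol the flash-attn extension expects; demangles to c10::impl::cow::cow_deleter(void*).
    SYMBOL = "_ZN3c104impl3cow11cow_deleterEPv"

    # On Linux, libc10.so ships inside the torch wheel.
    libc10 = os.path.join(os.path.dirname(torch.__file__), "lib", "libc10.so")

    # List the dynamic symbols exported by the installed torch and look for the one we need.
    exported = subprocess.run(["nm", "-D", libc10], capture_output=True, text=True).stdout
    print("torch", torch.__version__, "exports cow_deleter:", SYMBOL in exported)
    # If this prints False, the installed torch predates that symbol: either upgrade torch
    # or install a flash-attn build that matches torch 2.1.x.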

I got the same error when I tried using 2.5.8, and I solved it by temporarily rolling back to flash_attn 2.5.7.

pip install flash_attn==2.5.7

Hope that helps you.

Heartfirey · May 18 '24
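
After reinstalling, a quick way to confirm the package actually links against the installed torch is to import the compiled extension directly. This is just a sanity-check sketch, not part of either package's documentation:

    import torch
    import flash_attn

    print("torch:", torch.__version__, "flash_attn:", flash_attn.__version__)

    # Importing the interface module loads the compiled flash_attn_2_cuda extension;
    # an ABI mismatch would surface here as the same "undefined symbol" ImportError.
    import flash_attn.flash_attn_interface  # noqa: F401
    print("flash_attn_2_cuda loaded successfully")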

> I got the same error when I tried using 2.5.8, and I solved it by temporarily rolling back to flash_attn 2.5.7.
>
> pip install flash_attn==2.5.7
>
> Hope that helps you.

I ran pip install flash_attn==2.5.7 and got:

Looking in indexes: https://mirrors.aliyun.com/pypi/simple/
Collecting flash_attn==2.5.7
  Using cached https://mirrors.aliyun.com/pypi/packages/21/cb/33a1f833ac4742c8adba063715bf769831f96d99dbbbb4be1b197b637872/flash_attn-2.5.7.tar.gz (2.5 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "/data/ComfyUI/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/data/ComfyUI/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/data/ComfyUI/venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-71t3p8gt/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 327, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
        File "/tmp/pip-build-env-71t3p8gt/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 297, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-71t3p8gt/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 497, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/tmp/pip-build-env-71t3p8gt/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 313, in run_setup
          exec(code, locals())
        File "<string>", line 19, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

wangwenqiao666 · Jul 18 '24

Check 4. I solved this issue by adding the --no-build-isolation flag! (The ModuleNotFoundError appears because pip builds the package in an isolated environment that doesn't contain torch; --no-build-isolation lets setup.py use the torch already installed in your venv.)

TJ-Solergibert · Jul 18 '24

> Check 4. I solved this issue by adding the --no-build-isolation flag!

Thanks, I used pip install flash-attn --no-build-isolation and that solved the problem too.

wangwenqiao666 · Jul 18 '24
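
For anyone who wants to go one step beyond a clean install, here is a small end-to-end smoke test, a sketch that assumes an fp16-capable CUDA GPU is available (the shapes are arbitrary):

    import torch
    from flash_attn import flash_attn_func

    # flash-attn expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16 on the GPU.
    q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
    k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
    v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

    # Runs the fused attention kernel; any lingering ABI or build problem would fail here.
    out = flash_attn_func(q, k, v, causal=True)
    print(out.shape)  # expected: torch.Size([1, 128, 8, 64])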

Thanks. With Python 3.9.19, CUDA 12.2, torch 2.3.0, and flash_attn==2.5.8, it works.

Bellocccc · Jul 26 '24