flash-attention
flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
I installed everything from scratch.
torch: 2.3.0 flash-attn: 2.5.7 exllama: 0.0.19
Still getting the error:
flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
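For reference, a minimal way to dump the relevant versions (a sketch, not part of either package; the undefined symbol demangles to `c10::cuda::SetDevice(int)`, which usually indicates the prebuilt flash-attn wheel was compiled against a different torch than the one installed):

```python
# Minimal version check (a sketch). An "undefined symbol" from
# flash_attn_2_cuda at import time usually means the prebuilt wheel
# and the installed torch don't match.
import torch

print("torch:", torch.__version__)            # e.g. 2.3.0+cu121
print("torch built with CUDA:", torch.version.cuda)

try:
    import flash_attn
    print("flash-attn:", flash_attn.__version__)
except ImportError as exc:
    # The undefined-symbol failure surfaces here as an ImportError.
    print("flash-attn import failed:", exc)
```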
Flash attention 2.5.8 has been released. It might resolve the issue.
Yep, we have new wheels compiled for PyTorch 2.3.0.
torch: 2.2.0 flash-attn: 2.5.8
Getting the same error.
Wheels are built for torch 2.2.2 and torch 2.3.0; it looks like they're not compatible with torch 2.2.0. You can try a previous version of flash-attn, or build from source.
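If it helps, here is a rough pre-install check along those lines. The version set below is taken from this thread and is an assumption, not the definitive list of wheel builds; check the release page for the actual wheels.

```python
# Rough compatibility check before installing flash-attn 2.5.8 (a sketch).
import torch

# Assumption from this thread: torch versions with prebuilt wheels.
WHEEL_TORCH_VERSIONS = {"2.2.2", "2.3.0"}

base = torch.__version__.split("+")[0]  # drop the local tag, e.g. "+cu121"
if base in WHEEL_TORCH_VERSIONS:
    print(f"torch {base}: a matching prebuilt wheel should exist.")
else:
    print(f"torch {base}: no matching wheel; upgrade torch or build from "
          "source, e.g. `pip install flash-attn --no-build-isolation`.")
```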
Building from source resolved the error for me.
@tridao Still hitting the problem when building from the new release.
I also get the same error. Please help. Python 3.10, CUDA 12.1, Ubuntu 22.04.
> torch: 2.2.0 flash-attn: 2.5.8
> Getting the same error.

torch 2.3.0 + flash-attn 2.5.8 works fine for me.
> I also get the same error. Please help. Python 3.10, CUDA 12.1, Ubuntu 22.04.

torch 2.3.0 + flash-attn 2.5.8 works for me (same configuration as yours: Python 3.10, CUDA 12.1, Ubuntu 22.04).
Thanks. Python 3.9.19, CUDA 12.2, torch 2.3.0, flash_attn==2.5.8: it works.