
flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

rjmehta1993 opened this issue 9 months ago · 10 comments

Installing everything from scratch.

torch: 2.3.0, flash-attn: 2.5.7, exllama: 0.0.19

Still getting the error:

flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

rjmehta1993 avatar Apr 26 '24 17:04 rjmehta1993
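
The undefined symbol in the error, `_ZN3c104cuda9SetDeviceEi`, demangles to `c10::cuda::SetDevice(int)`, which lives in torch's `libc10_cuda` library, so the failure usually means the prebuilt flash-attn wheel was compiled against a different torch than the one installed. A minimal diagnostic sketch, assuming a CUDA build of torch on Linux (the library path and ctypes lookup are illustrative, not an official API):

```python
# Check whether the symbol the flash-attn extension needs is exported by the
# installed torch. If it is missing, the wheel and torch target different versions.
import ctypes
import os

import torch

print("torch:", torch.__version__, "CUDA:", torch.version.cuda)

lib_path = os.path.join(os.path.dirname(torch.__file__), "lib", "libc10_cuda.so")
lib = ctypes.CDLL(lib_path)  # torch is already imported, so its dependencies are loaded

try:
    getattr(lib, "_ZN3c104cuda9SetDeviceEi")  # c10::cuda::SetDevice(int)
    print("Symbol found: this torch exports what the flash-attn wheel expects.")
except AttributeError:
    print("Symbol missing: install a flash-attn wheel built for this torch, or build from source.")
```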

Flash attention 2.5.8 has been released. It might resolve the issue.

nzw0301 avatar Apr 27 '24 02:04 nzw0301

Yep, we have new wheels compiled for PyTorch 2.3.0.

tridao avatar Apr 27 '24 03:04 tridao
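
A quick way to confirm that an upgrade actually fixed things: importing the compiled extension `flash_attn_2_cuda` is the step that fails with the undefined-symbol error, so the sketch below (assuming flash-attn is installed into the active environment, e.g. via `pip install --upgrade flash-attn`) just prints both versions and attempts that import.

```python
# Verify that the installed flash-attn wheel loads against the installed torch.
import torch
print("torch:", torch.__version__)

import flash_attn
print("flash-attn:", flash_attn.__version__)

# Importing the compiled extension is the actual ABI test; on a mismatch this
# raises ImportError with the "undefined symbol" message from the issue title.
import flash_attn_2_cuda
print("flash_attn_2_cuda extension loaded successfully")
```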

torch: 2.2.0 flash-attn: 2.5.8

Getting the same error.

ShinnosukeUesaka avatar Apr 27 '24 18:04 ShinnosukeUesaka

Wheels are built for torch 2.2.2 and torch 2.3.0; it looks like they're not compatible with torch 2.2.0. You can try a previous version of flash-attn, or build from source.

tridao avatar Apr 27 '24 22:04 tridao
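
A sketch of the build-from-source route, driven from Python so it can sit in a setup script. The shell equivalent is roughly `MAX_JOBS=4 FLASH_ATTENTION_FORCE_BUILD=TRUE pip install flash-attn --no-build-isolation`; `MAX_JOBS` and `FLASH_ATTENTION_FORCE_BUILD` are environment variables the flash-attn setup script reads, and the values here are only examples.

```python
# Build flash-attn from source against the torch already installed in this
# environment. MAX_JOBS caps parallel nvcc jobs (compilation is memory hungry);
# FLASH_ATTENTION_FORCE_BUILD=TRUE asks the setup script to compile locally
# instead of downloading a prebuilt wheel. Treat both values as examples.
import os
import subprocess
import sys

env = dict(os.environ, MAX_JOBS="4", FLASH_ATTENTION_FORCE_BUILD="TRUE")
subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "flash-attn", "--no-build-isolation"],
    env=env,
)
```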

Building from source resolved the error for me.

ckgresla avatar May 02 '24 17:05 ckgresla

@tridao Still hitting the problem when building from the new release.

foreverpiano avatar May 27 '24 13:05 foreverpiano

I also get the same error. Please help. Python 3.10, CUDA 12.1, Ubuntu 22.04.

CyberTimon avatar May 30 '24 09:05 CyberTimon

> torch: 2.2.0 flash-attn: 2.5.8
> Getting the same error.

torch 2.3.0 + flash-attn 2.5.8 works fine for me.

rkuo2000 avatar Jun 05 '24 19:06 rkuo2000

> torch 2.3.0 + flash-attn 2.5.8 works fine for me

> I also get the same error. Please help. Python 3.10, CUDA 12.1, Ubuntu 22.04

torch 2.3.0 + flash-attn 2.5.8 works here too (same configuration as yours: Python 3.10, CUDA 12.1, Ubuntu 22.04).

rkuo2000 avatar Jun 05 '24 19:06 rkuo2000

Thanks. Python 3.9.19, CUDA 12.2, torch 2.3.0, flash_attn==2.5.8: it works.

Bellocccc avatar Jul 26 '24 04:07 Bellocccc
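
Once the versions line up, a minimal smoke test that the install actually runs (requires a CUDA GPU; `flash_attn_func` expects fp16/bf16 tensors of shape `(batch, seqlen, nheads, headdim)`):

```python
# Quick functional check that the installed flash-attn works end to end.
import torch
from flash_attn import flash_attn_func

q, k, v = (
    torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
    for _ in range(3)
)
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # expected: torch.Size([1, 128, 8, 64])
```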