
ModuleNotFoundError: No module named 'flash_attn_2_cuda'

Open Hansyvea opened this issue 1 year ago • 8 comments

Python 3.10.13 (main, Sep 11 2023, 13:44:35) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.

>>> import flash_attn
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/xxx/githubRepo/flash-attention/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/xxx/githubRepo/flash-attention/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ModuleNotFoundError: No module named 'flash_attn_2_cuda'

env: cuda_version: 12.2, torch: 2.1

Why would this happen?
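A likely cause (not confirmed in this thread) is a binary mismatch: `flash_attn_2_cuda` is a compiled CUDA extension, so a prebuilt flash-attn wheel must match the torch and CUDA versions it was built against; a wheel built for a different CUDA major version will fail to import. The sketch below only illustrates that version-matching idea with string comparison; the helper names are hypothetical, not part of flash-attn.

```python
# Hypothetical diagnostic sketch: a prebuilt flash-attn wheel generally needs
# the same CUDA major version as the local torch build, or the compiled
# extension flash_attn_2_cuda fails to import.

def cuda_major(version: str) -> str:
    """Return the major component of a CUDA version string, e.g. '12.2' -> '12'."""
    return version.split(".")[0]

def wheel_matches_env(wheel_cuda: str, env_cuda: str) -> bool:
    """True when the wheel's CUDA major version matches the environment's."""
    return cuda_major(wheel_cuda) == cuda_major(env_cuda)

# A wheel built against CUDA 11.8 will not load under a CUDA 12.2 torch:
print(wheel_matches_env("11.8", "12.2"))  # False
print(wheel_matches_env("12.1", "12.2"))  # True
```

If the versions do differ, the flash-attention README recommends rebuilding the extension in the current environment with `pip install flash-attn --no-build-isolation`.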

Hansyvea avatar Apr 30 '24 05:04 Hansyvea

hi bro! did you solve it?

yqin-falling-stars avatar May 01 '24 11:05 yqin-falling-stars

Me too, did you solve it?

coderchem avatar May 14 '24 03:05 coderchem

same error.

rantianhua avatar May 14 '24 15:05 rantianhua

still getting this error

saurabh-kataria avatar Sep 06 '24 03:09 saurabh-kataria