
module 'flash_attn' has no attribute 'flash_attn_varlen_qkvpacked_func'

william-ngvn opened this issue 1 year ago · 4 comments

Hi Tri Dao, thanks for the great work.

I'm trying to train the project with FlashAttention-2 applied, but my graphics card is an RTX 2080, which is not yet supported by FlashAttention-2, so I downloaded and installed FlashAttention-1 to train with instead.

• This is the error I received:
  File "/mnt/will1tb/work_space/Pointcept/exp/nusecenes/semseg-v3m1-0-base/code/pointcept/models/point_transformer_v3/point_v3m1_base.py", line 208, in forward
    feature = flash_attn.flash_attn_varlen_qkvpacked_func(
  AttributeError: module 'flash_attn' has no attribute 'flash_attn_varlen_qkvpacked_func'

• Following https://github.com/Dao-AILab/flash-attention/issues/745#issuecomment-1873780475, I tried importing from 'flash_attn.flash_attn_interface' instead, but it returned "SyntaxError: invalid syntax".

• So could you help me with a couple of questions, please: 1. How do I switch from FlashAttention-2 to FlashAttention-1 when the project's code is written against FlashAttention-2? 2. Or could you suggest something else for this case? (A quick check of which API my install actually exposes is sketched after my setup below.)

My setup:
• Graphics card: 1× RTX 2080 (11 GB VRAM)
• CUDA: 11.8
• PyTorch: 2.2.0
Thank you so much.
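
For reference, this is the quick sanity check I mean, just a rough sketch to confirm which flash_attn API my environment actually exposes (it assumes the package exports __version__, which I believe both 1.x and 2.x do):

```python
# Quick check (sketch) of which flash_attn API is installed in my environment.
import flash_attn

print(flash_attn.__version__)  # e.g. "1.0.9" vs "2.x.y"
# The varlen_* names only exist on 2.x; 1.x exposes the "unpadded" names instead.
print(hasattr(flash_attn, "flash_attn_varlen_qkvpacked_func"))
```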

william-ngvn · May 02 '24 08:05

pip install flash-attn==1.0.9

tridao · May 02 '24 18:05

Thank you for your prompt reply, @tridao. In my case I am using FlashAttention-1 (because the RTX 2080 only supports FlashAttention-1), but the project I want to test-run uses FlashAttention-2. So I want to ask: is there any way to map the code from FlashAttention-2 back to FlashAttention-1, or something along those lines? For example, renaming flash_attn_varlen_func -> flash_attn_unpadded_func, etc. (a rough shim along these lines is sketched below).
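
Something like this untested sketch is what I have in mind; it assumes the 1.x name flash_attn_unpadded_qkvpacked_func in flash_attn.flash_attn_interface (per the issue linked above) and that the packed qkv layout and cu_seqlens/max_seqlen arguments mean the same thing in both versions:

```python
# Untested sketch of a small compatibility shim: expose the FlashAttention-2
# name on top of the FlashAttention-1 (flash-attn==1.0.9) interface, so code
# that calls flash_attn.flash_attn_varlen_qkvpacked_func(...) keeps working.
import flash_attn
from flash_attn.flash_attn_interface import flash_attn_unpadded_qkvpacked_func


def _varlen_qkvpacked_compat(qkv, cu_seqlens, max_seqlen,
                             dropout_p=0.0, softmax_scale=None, causal=False):
    # Assumption: the packed (total_tokens, 3, nheads, headdim) qkv layout is
    # the same in both versions; dropout_p is positional in the 1.x function.
    return flash_attn_unpadded_qkvpacked_func(
        qkv, cu_seqlens, max_seqlen, dropout_p,
        softmax_scale=softmax_scale, causal=causal,
    )


# Only patch when the 2.x name is missing (i.e. when 1.x is installed).
if not hasattr(flash_attn, "flash_attn_varlen_qkvpacked_func"):
    flash_attn.flash_attn_varlen_qkvpacked_func = _varlen_qkvpacked_compat
```

If that is a reasonable approach, I would drop it near the top of point_v3m1_base.py (or wherever flash_attn is first imported) so the patch runs before the forward pass.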

william-ngvn · May 03 '24 07:05

Sure, you can try that.

tridao · May 03 '24 16:05

I appreciate your help, bro!!!

william-ngvn · May 07 '24 02:05