Results: 3 comments of Ragagnin
I'm facing the same issue!
In my case, I'm running on Colab, and apparently there is an incompatibility between CUDA and flash_attn:

```
!python -c "import flash_attn"
!python -c "from flash_attn.flash_attention import FlashMHA"
```

```
Traceback...
```
Hello, I'm having the same issue. I'm using CUDA 11.7.0, torch 2.0.1+cu117, and flash-attn 1.0.4. I thought it was an issue specific to my setup, depending on the GPU I was using....
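For anyone debugging this kind of mismatch, here is a minimal diagnostic sketch (assuming the installed flash-attn exposes a `__version__` attribute, as recent releases do): it prints the CUDA build torch was compiled against and wraps the failing imports in try/except so the version mismatch and the actual error show up in a single run.

```python
# Diagnostic sketch: report the torch/CUDA build and attempt the flash-attn imports.
import torch

print("torch:", torch.__version__)                # e.g. 2.0.1+cu117
print("built against CUDA:", torch.version.cuda)  # e.g. 11.7
print("CUDA available:", torch.cuda.is_available())

try:
    import flash_attn
    # __version__ is assumed to exist in the installed release; fall back if not.
    print("flash-attn:", getattr(flash_attn, "__version__", "unknown"))
except ImportError as err:
    print("flash_attn import failed:", err)

try:
    # FlashMHA is exposed at this path in flash-attn 1.x; later releases reorganized the modules.
    from flash_attn.flash_attention import FlashMHA  # noqa: F401
    print("FlashMHA import OK")
except Exception as err:
    print("FlashMHA import failed:", err)
```

If the printed CUDA version does not match the toolkit the installed flash-attn wheel was built for, reinstalling flash-attn against that torch/CUDA combination (or building it from source in the same environment) is usually the simplest way out.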