
ImportError: cannot import name 'flash_attn_varlen_func_with_kvcache' from 'flash_attn' (/python3.10/site-packages/flash_attn/__init__.py)

Open foreverpiano opened this issue 1 year ago • 3 comments

>>> import flash_attn
>>> flash_attn.__version__
'2.5.0'
>>> from flash_attn import (flash_attn_func, flash_attn_varlen_func, flash_attn_varlen_func_with_kvcache, flash_attn_with_kvcache)
ImportError: cannot import name 'flash_attn_varlen_func_with_kvcache' from 'flash_attn' 
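One way to avoid the hard ImportError is to check which names the installed flash_attn actually exports before importing them. This is a minimal sketch; `exported_names` is a hypothetical helper written for illustration, not part of flash_attn:

```python
import importlib
import importlib.util

def exported_names(module_name, candidates):
    """Return which of `candidates` the installed module actually
    exports, or None if the module is not installed at all."""
    if importlib.util.find_spec(module_name) is None:
        return None
    mod = importlib.import_module(module_name)
    return [name for name in candidates if hasattr(mod, name)]

# flash_attn 2.5.0 exports flash_attn_with_kvcache but not
# flash_attn_varlen_func_with_kvcache, so the missing name is simply
# filtered out instead of raising ImportError at import time.
wanted = ["flash_attn_func", "flash_attn_varlen_func",
          "flash_attn_varlen_func_with_kvcache", "flash_attn_with_kvcache"]
print(exported_names("flash_attn", wanted))
```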

foreverpiano avatar Jan 26 '24 07:01 foreverpiano

As the error message says, there's no flash_attn_varlen_func_with_kvcache.

tridao avatar Jan 26 '24 08:01 tridao

Is this a version issue? When I downgrade flash_attn to 2.3.0, it works. @tridao

foreverpiano avatar Jan 30 '24 08:01 foreverpiano

> is this a version issue? i try to downgrade flash_attn to 2.3.0 and it works. @tridao

I also encountered this problem when I tried to run pytest -q -s tests/test_flash_attn.py, as described in https://github.com/Dao-AILab/flash-attention?tab=readme-ov-file#tests.

I found that this error is caused by a version mismatch. @tridao My source checkout (downloaded with git clone) is flash_attn v2.5.7, but the copy installed on the system (preinstalled in the PyTorch Docker image, located at /python3.10/site-packages/flash_attn/__init__.py) is flash_attn v2.0.4. flash_attn_with_kvcache exists in v2.5.7 but not in v2.0.4.
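A quick way to spot this kind of shadowing is to print the file path and version of the copy Python actually imports. A minimal sketch; `where_and_version` is a hypothetical helper, not part of flash_attn:

```python
import importlib

def where_and_version(module_name):
    """Report the file path and version of the copy Python actually
    imports, making a site-packages vs. source-checkout mismatch
    visible at a glance."""
    mod = importlib.import_module(module_name)
    path = getattr(mod, "__file__", "<built-in>")
    version = getattr(mod, "__version__", "<unknown>")
    return path, version

# For flash_attn this would report the site-packages path and '2.0.4',
# confirming that the preinstalled wheel shadows the v2.5.7 checkout.
```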

Here are two recommended solutions:

Solution 1: Downgrade the source checkout to v2.0.4. In your flash_attn source directory, run git checkout v2.0.4, then run pytest -q -s tests/test_flash_attn.py.

Solution 2: Upgrade the installed version to v2.5.7. Run pip uninstall flash_attn, then from your flash_attn source directory run python setup.py install, then run pytest -q -s tests/test_flash_attn.py.

shifang99 avatar Apr 12 '24 08:04 shifang99