
from flash_attn.layers.rotary import RotaryEmbedding

Open Carol-gutianle opened this issue 1 year ago • 19 comments

When I just import this module, I encounter this:
File "test.py", line 1, in <module>
    from flash_attn.layers.rotary import RotaryEmbedding
  File "/mnt/petrelfs/gutianle/miniconda3/lib/python3.8/site-packages/flash_attn/layers/rotary.py", line 10, in <module>
    import rotary_emb
ModuleNotFoundError: No module named 'rotary_emb'

Carol-gutianle avatar Apr 09 '23 07:04 Carol-gutianle

Traceback (most recent call last):
  File "test.py", line 1, in <module>
    from flash_attn.flash_attention import FlashAttention
  File "/mnt/petrelfs/gutianle/miniconda3/lib/python3.8/site-packages/flash_attn/flash_attention.py", line 7, in <module>
    from flash_attn.flash_attn_interface import flash_attn_unpadded_qkvpacked_func
  File "/mnt/petrelfs/gutianle/miniconda3/lib/python3.8/site-packages/flash_attn/flash_attn_interface.py", line 5, in <module>
    import flash_attn_cuda
ImportError: /mnt/petrelfs/gutianle/miniconda3/lib/python3.8/site-packages/flash_attn_cuda.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda20CUDACachingAllocator9allocatorE

Carol-gutianle avatar Apr 09 '23 07:04 Carol-gutianle

Ok, I have solved both problems above. For the first, I had forgotten to install rotary from its directory. For the second, I checked my CUDA and torch-CUDA versions and reinstalled.

Carol-gutianle avatar Apr 10 '23 05:04 Carol-gutianle

I found that if you install a different version of CUDA, flash-attn must also be reinstalled against it.

aimetrics avatar Apr 18 '23 06:04 aimetrics
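A quick way to spot this kind of mismatch is to compare the CUDA version PyTorch was built against with the toolkit used to build flash-attn. The helper below is an illustrative sketch (not from this thread); it degrades gracefully when torch is absent:

```python
def torch_cuda_version():
    """Return the CUDA version PyTorch was compiled against, or None.

    Returns None when torch is not installed or was built without CUDA.
    A mismatch between this value and the toolkit flash-attn was built
    with typically produces 'undefined symbol' import errors like the
    one above.
    """
    try:
        import torch
    except ImportError:
        return None
    return torch.version.cuda  # e.g. "11.6", or None for CPU-only builds
```

Compare the result against `nvcc --version`; if they differ, reinstall flash-attn after fixing the environment.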

Ok, I have solved both problems above. For the first, I had forgotten to install rotary from its directory. For the second, I checked my CUDA and torch-CUDA versions and reinstalled.

Hi Carol, could you tell me how you installed rotary? I tried pip install rotary-embedding-torch and even cloned the flash-attention repo and installed it manually, but neither worked.

Thanks

kenshinsee avatar May 03 '23 09:05 kenshinsee

I resolved it: I specifically ran the setup.py under csrc/rotary, rather than the setup.py in the root directory of flash-attention.

kenshinsee avatar May 03 '23 09:05 kenshinsee

Ok, I have solved both problems above. For the first, I had forgotten to install rotary from its directory. For the second, I checked my CUDA and torch-CUDA versions and reinstalled.

Hi, I got the same error as the second problem. Could you tell me your PyTorch and CUDA versions? Mine are: CUDA 11.7, PyTorch 1.13.1+cu117.

aaahuia avatar May 10 '23 09:05 aaahuia

My versions are CUDA 11.6 and PyTorch 1.13.1+cu116.

Carol-gutianle avatar May 10 '23 09:05 Carol-gutianle

My versions are CUDA 11.6 and PyTorch 1.13.1+cu116.

Could you tell me your gcc version? I think the gcc version also matters. Another environment of mine is also on 11.6 but hits the same problem, and in another issue a user fixed it by changing the gcc version. Thanks.

aaahuia avatar May 10 '23 09:05 aaahuia

gcc 10.2.0

Carol-gutianle avatar May 10 '23 09:05 Carol-gutianle

Thanks, it worked for me (my CUDA version is 11.7):

pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu117
pip uninstall flash-attn
pip install flash-attn --no-cache-dir --no-build-isolation

gugeonmo avatar Jul 07 '23 01:07 gugeonmo
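After reinstalling, one way to confirm that the extension modules now resolve is to probe them without triggering a full import. This is an illustrative helper, not part of flash-attn:

```python
import importlib.util


def is_importable(name):
    """Check whether a top-level module can be resolved,
    without actually importing (and thus executing) it."""
    return importlib.util.find_spec(name) is not None


# After a successful install, each of these should report True:
#   is_importable("flash_attn")
#   is_importable("flash_attn_cuda")
#   is_importable("rotary_emb")
```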

I resolved it: I specifically ran the setup.py under csrc/rotary, rather than the setup.py in the root directory of flash-attention.

Hi, could you tell me which gcc version you have?

BaiMeiyingxue avatar Jul 18 '23 12:07 BaiMeiyingxue

My versions are CUDA 11.6 and PyTorch 1.13.1+cu116.

Could you tell me your gcc version? I think the gcc version also matters. Another environment of mine is also on 11.6 but hits the same problem, and in another issue a user fixed it by changing the gcc version. Thanks.

Hi, have you fixed the error?

BaiMeiyingxue avatar Jul 18 '23 12:07 BaiMeiyingxue

I resolved it: I specifically ran the setup.py under csrc/rotary, rather than the setup.py in the root directory of flash-attention.

git clone -b v1.0.8 https://github.com/Dao-AILab/flash-attention /src/flash-attention
cd /src/flash-attention && pip install . 
cd /src/flash-attention; find csrc/ -type d -exec sh -c 'cd {} && pip install . && cd ../../' \;

For me, going into each directory under csrc/ sequentially and installing it again solved the problem.

luckyops avatar Oct 03 '23 11:10 luckyops

I installed flash-attention with pip install flash-attn --no-build-isolation. How can I install rotary_emb? Otherwise I get this error:

import rotary_emb
ModuleNotFoundError: No module named 'rotary_emb'

MrRace avatar Oct 10 '23 07:10 MrRace

cd csrc/rotary && python setup.py install

tridao avatar Oct 10 '23 07:10 tridao

cd csrc/rotary && python setup.py install

@tridao Does that mean that even though I installed flash-attn with pip, I still have to download the source code so that I can run cd csrc/rotary && python setup.py install to install rotary_emb?

MrRace avatar Oct 10 '23 07:10 MrRace

rotary_emb is not part of the flash attention package; you don't have to use it. You can also use pip, something like pip install "git+https://github.com/Dao-AILab/flash-attention.git#subdirectory=csrc/rotary", which does the same thing (clone the repo and install from csrc/rotary) but may be more convenient.

tridao avatar Oct 10 '23 08:10 tridao
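For context, rotary_emb is a fused CUDA kernel for the rotary position-embedding math. A pure-Python sketch of what it computes on a single token vector (illustrative only; this is not the flash-attn API, and the real kernel operates on batched GPU tensors):

```python
import math


def apply_rotary(x, position, base=10000.0):
    """Rotate consecutive pairs (x[2i], x[2i+1]) of a flat vector by
    position-dependent angles, as in rotary position embeddings.

    x:        list of floats with even length
    position: integer token position
    """
    d = len(x)
    out = []
    for i in range(0, d, 2):
        # angle theta_i = position * base^(-2*(i/2)/d)
        theta = position * (base ** (-i / d))
        c, s = math.cos(theta), math.sin(theta)
        x1, x2 = x[i], x[i + 1]
        # standard 2-D rotation of the pair
        out.extend([x1 * c - x2 * s, x1 * s + x2 * c])
    return out
```

The rotation is norm-preserving and leaves position 0 unchanged, which is a quick sanity check for any implementation.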

pip install "git+https://github.com/Dao-AILab/flash-attention.git#subdirectory=csrc/rotary"

Thanks a lot, pip install "git+https://github.com/Dao-AILab/flash-attention.git#subdirectory=csrc/rotary" worked for me.

MrRace avatar Oct 10 '23 08:10 MrRace

I solved the problem by installing the right release, such as flash_attn-2.2.0+cu116torch1.13cxx11abiFALSE-cp311-cp311-linux_x86_64.whl.

Violet969 avatar Mar 20 '24 12:03 Violet969