
Conda Environment:ModuleNotFoundError: No module named 'flash_attn'

Li-Whasaka opened this issue on Aug 16, 2023 · 6 comments

When I run a test I get an error, and I can't find the cause. Test command:

torchpack dist-run -np 1 python tools/test.py configs/nuscenes/det/centerhead/lssfpn/camera/256x704/swint/default.yaml pretrained/swint-nuimages-pretrained.pth --eval bbox --out box.pkl

and I got this error:

Traceback (most recent call last):
  File "tools/test.py", line 16, in <module>
    from mmdet3d.models import build_model
  File "/home/lwx/bev/bevfusion/mmdet3d/models/__init__.py", line 1, in <module>
    from .backbones import *
  File "/home/lwx/bev/bevfusion/mmdet3d/models/backbones/__init__.py", line 9, in <module>
    from .radar_encoder import *
  File "/home/lwx/bev/bevfusion/mmdet3d/models/backbones/radar_encoder.py", line 18, in <module>
    from flash_attn.flash_attention import FlashMHA
ModuleNotFoundError: No module named 'flash_attn'

Primary job terminated normally, but 1 process returned a non-zero exit code. Per user-direction, the job has been aborted.

mpirun detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[64587,1],0] Exit code: 1

I tried to install the flash_attn module, but the install failed, because the CUDA and PyTorch versions that flash-attn requires are newer than the ones bevfusion requires.

Li-Whasaka avatar Aug 16 '23 14:08 Li-Whasaka

pip install flash-attn==0.2.0 easy fix

quantumdotsss avatar Aug 16 '23 18:08 quantumdotsss

pip install flash-attn==0.2.0 easy fix

Thank you for your reply, but I have tried that before and it doesn't work. It is probably still the version mismatch I mentioned above:

ERROR: Command errored out with exit status 1:
  command: /home/lwx/anaconda3/envs/mitbevfusion/bin/python -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = "/tmp/pip-install-2fqtkd28/flash-attn_be277a22af024b49bf1faa696bb97f10/setup.py"; __file__ = "/tmp/pip-install-2fqtkd28/flash-attn_be277a22af024b49bf1faa696bb97f10/setup.py"; f = getattr(tokenize, "open", open)(__file__) if os.path.exists(__file__) else io.StringIO("from setuptools import setup; setup()"); code = f.read().replace("\r\n", "\n"); f.close(); exec(compile(code, __file__, "exec"))' egg_info --egg-base /tmp/pip-pip-egg-info-qmma7sl8
  cwd: /tmp/pip-install-2fqtkd28/flash-attn_be277a22af024b49bf1faa696bb97f10/
Complete output (20 lines):
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-install-2fqtkd28/flash-attn_be277a22af024b49bf1faa696bb97f10/setup.py", line 108, in <module>
    _, bare_metal_major, _ = get_cuda_bare_metal_version(CUDA_HOME)
  File "/tmp/pip-install-2fqtkd28/flash-attn_be277a22af024b49bf1faa696bb97f10/setup.py", line 23, in get_cuda_bare_metal_version
    raw_output = subprocess.check_output([cuda_dir + "/bin/nvcc", "-V"], universal_newlines=True)
  File "/home/lwx/anaconda3/envs/mitbevfusion/lib/python3.8/subprocess.py", line 415, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/home/lwx/anaconda3/envs/mitbevfusion/lib/python3.8/subprocess.py", line 493, in run
    with Popen(*popenargs, **kwargs) as process:
  File "/home/lwx/anaconda3/envs/mitbevfusion/lib/python3.8/subprocess.py", line 858, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/home/lwx/anaconda3/envs/mitbevfusion/lib/python3.8/subprocess.py", line 1720, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: ':/usr/local/cuda-11.7/bin/nvcc:/usr/local/cuda-11.7/bin/nvcc/bin/nvcc'

torch.__version__ = 1.10.0+cu111
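[Editor's note] The FileNotFoundError above shows CUDA_HOME expanded to a colon-joined, PATH-like string rather than a single toolkit directory, which is why flash-attn's setup.py cannot find nvcc. A minimal sketch of the check setup.py effectively performs (the helper name nvcc_path is hypothetical, not part of flash-attn):

```python
import os

def nvcc_path(cuda_home: str) -> str:
    """Return the nvcc binary that flash-attn's setup.py would invoke.

    Rejects PATH-like values: the traceback above shows CUDA_HOME set to
    ':/usr/local/cuda-11.7/bin/nvcc:...', which is not a directory.
    """
    if ":" in cuda_home:
        raise ValueError(f"CUDA_HOME must be a single directory, got {cuda_home!r}")
    return os.path.join(cuda_home, "bin", "nvcc")

# With a sane value, e.g. after `export CUDA_HOME=/usr/local/cuda-11.7`:
print(nvcc_path("/usr/local/cuda-11.7"))  # /usr/local/cuda-11.7/bin/nvcc
```

Pointing CUDA_HOME at the toolkit root (adjust the path to your install) should let setup.py run nvcc -V successfully.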


Li-Whasaka avatar Aug 17 '23 03:08 Li-Whasaka

Install it and then re-compile:

python setup.py develop

liwenxiang12 @.***> wrote on Wed, Aug 16, 2023 at 20:42:

pip install flash-attn==0.2.0 easy fix

Thank you for your reply, but I have tried it before and it doesn't work


quantumdotsss avatar Aug 17 '23 08:08 quantumdotsss

Just comment out the line from flash_attn.flash_attention import FlashMHA (in radar_encoder.py) and the line from .radar_encoder import * (in backbones/__init__.py). They are only used for radar. Hope this can help you.
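[Editor's note] Instead of deleting the lines outright, the same effect can be had with a guarded import, so camera-only configs run while the radar path still fails loudly if used. A sketch using only the stdlib (the helper optional_import is hypothetical, not part of bevfusion):

```python
import importlib

def optional_import(module_name: str, attr: str):
    """Return `attr` from `module_name`, or None if the module is missing."""
    try:
        return getattr(importlib.import_module(module_name), attr)
    except ImportError:
        return None

# In radar_encoder.py, the hard import could become:
FlashMHA = optional_import("flash_attn.flash_attention", "FlashMHA")
# FlashMHA is None when flash-attn is absent; the radar encoder should
# then raise a clear error only if it is actually instantiated.
```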

wyf0414 avatar Nov 28 '23 06:11 wyf0414

pip install flash-attn==0.2.0 easy fix

I have met this problem, and your reply helped me. Thanks!

nowayhere1 avatar Mar 19 '24 07:03 nowayhere1

Try this.

/bevfusion/mmdet3d/models/backbones/radar_encoder.py :

from flash_attn.flash_attention import FlashMHA → from flash_attn.modules.mha import MHA

I think this error is caused by the API difference between flash-attention versions 1.0.9 and 2.0.0.
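[Editor's note] A version-tolerant import can cover both layouts at once, trying the flash-attn 1.x location first and then the 2.x one named above (resolve_mha is a hypothetical helper; verify the module paths against your installed flash-attn version):

```python
import importlib

def resolve_mha():
    """Return flash-attn's multi-head attention class across major
    versions, or None when flash-attn is not installed at all."""
    candidates = [
        ("flash_attn.flash_attention", "FlashMHA"),  # flash-attn 1.x
        ("flash_attn.modules.mha", "MHA"),           # flash-attn 2.x
    ]
    for module, name in candidates:
        try:
            return getattr(importlib.import_module(module), name)
        except (ImportError, AttributeError):
            continue
    return None
```

Note that the two classes are not drop-in compatible; constructor arguments may differ between major versions, so call sites in radar_encoder.py may also need adjusting.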

hsingyu-chou avatar Mar 28 '24 20:03 hsingyu-chou