
flash-attn install failed

Open BaiMoHan opened this issue 1 year ago • 5 comments

RuntimeError:
      The detected CUDA version (12.2) mismatches the version that was used to compile
      PyTorch (11.3). Please make sure to use the same CUDA versions.

      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
  Building wheel for future (setup.py) ... done
  Created wheel for future: filename=future-0.18.3-py3-none-any.whl size=492024 sha256=1db8ebe22124761ed948511526efe5885139151d45616f670daa921e552c3afe
  Stored in directory: /home/ubuntu/.cache/pip/wheels/a0/0b/ee/e6994fadb42c1354dcccb139b0bf2795271bddfe6253ccdf11
Successfully built easydict fairscale future
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects

The torch version in the requirements.txt file may need to be adjusted.
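The failing check can be illustrated with a small sketch: the build errors out because the system CUDA toolkit (12.2) and the CUDA version PyTorch was compiled against (11.3) disagree. The helper below is hypothetical (the real check lives inside PyTorch's extension-building machinery), and it assumes a matching major version is what is required:

```python
def cuda_major(version: str) -> int:
    """Extract the major component of a CUDA version string like '12.2'."""
    return int(version.split(".")[0])

def versions_compatible(detected: str, torch_built_with: str) -> bool:
    # Hypothetical helper: building flash-attn fails when the system CUDA
    # toolkit's major version differs from the one PyTorch was compiled with.
    return cuda_major(detected) == cuda_major(torch_built_with)

# The combination from the traceback above: system CUDA 12.2, torch built with 11.3.
print(versions_compatible("12.2", "11.3"))  # → False
```

The fix is to make the two sides agree, either by installing a torch wheel built for the system's CUDA toolkit or by adjusting the pinned torch version in requirements.txt.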

BaiMoHan avatar Dec 19 '23 04:12 BaiMoHan

Hi, it seems to be an issue with the CUDA version; ours is 11.4.

Steven-SWZhang avatar Dec 19 '23 06:12 Steven-SWZhang

This version of the code doesn't actually use the flash-attn module. In my case, I didn't install flash-attn and deleted all the import lines that reference it.

8414sys avatar Dec 19 '23 11:12 8414sys

This version of the code doesn't actually use the flash-attn module. In my case, I didn't install flash-attn and deleted all the import lines that reference it.

python inference.py --cfg configs/i2vgen_xl_infer.yaml  test_list_path data/test_list_for_i2vgen.txt test_model models/i2vgen_xl_00854500.pth

I used this command, and it needs flash-attn.

BaiMoHan avatar Dec 20 '23 10:12 BaiMoHan

I used this command, and it needs flash-attn.

I also use that command. Go to these files and delete the code below from each of them: ./i2vgen-xl/tools/modules/unet/unet_i2vgen.py, ./i2vgen-xl/tools/modules/unet/unet_t2v.py, ./i2vgen-xl/tools/modules/unet/util.py

code to delete

from .mha_flash import FlashAttentionBlock

Then it will work without flash-attn.
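Instead of deleting the import lines outright, a less invasive variant is to guard them so the code degrades gracefully when flash-attn is absent. This is a sketch, not the repository's actual code: in the repo the import is relative (from .mha_flash import FlashAttentionBlock), and the top-level name mha_flash below stands in for that module:

```python
try:
    # In the actual files this is a relative import: from .mha_flash import ...
    from mha_flash import FlashAttentionBlock
except ImportError:
    # flash-attn (and its wrapper module) is not installed; disable the block.
    FlashAttentionBlock = None

print(FlashAttentionBlock)
```

Code paths that would instantiate FlashAttentionBlock would then also need a check that it is not None.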

8414sys avatar Dec 20 '23 10:12 8414sys

I used this command, and it needs flash-attn.

I also use that command. Go to these files and delete the code below from each of them: ./i2vgen-xl/tools/modules/unet/unet_i2vgen.py, ./i2vgen-xl/tools/modules/unet/unet_t2v.py, ./i2vgen-xl/tools/modules/unet/util.py

code to delete

from .mha_flash import FlashAttentionBlock

Then it will work without flash-attn.

This worked for me. Thank you! But next, I got this problem: no kernel image is available for execution on the device.

BaiMoHan avatar Dec 20 '23 11:12 BaiMoHan