8 comments by XipengY

@PlutoQyl Thanks, it works for me!

Version v1.0.2 has no `device` parameter: https://github.com/HazyResearch/flash-attention/blob/v1.0.2/flash_attn/flash_attention.py#L21 But v0.2.8 does have a `device` parameter: https://github.com/HazyResearch/flash-attention/blob/v0.2.8/flash_attn/flash_attention.py#L21
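A minimal sketch of the difference, with constructor arguments assumed from the linked lines (an illustration, not the exact API):

```python
# Sketch only: argument names are taken from the linked source lines and may differ slightly.
from flash_attn.flash_attention import FlashAttention

# v0.2.8: the constructor still accepted device/dtype factory kwargs, e.g.
# attn = FlashAttention(softmax_scale=None, attention_dropout=0.0, device="cuda", dtype=None)

# v1.0.2: device/dtype were dropped, so construct the module without them.
attn = FlashAttention(softmax_scale=None, attention_dropout=0.0)
```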

Thanks for your reply! I found that torch==2.0.0 needs CUDA>=11.7; do we need to build torch from source? When I use `pip install -e .` the default version of flash-attn is 1.0.1, but I also...

I used `conda install pytorch torchvision cudatoolkit=11.7 -c pytorch` and `pip install flash-attn==0.2.8`, which solved the environment.
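As a quick sanity check of that environment (a sketch only, assuming torch and flash-attn were installed as above):

```python
# Verify the versions pulled in by the conda/pip commands above.
from importlib.metadata import version
import torch

print(torch.__version__)          # PyTorch build from the conda command
print(torch.version.cuda)         # should report 11.7
print(torch.cuda.is_available())  # True if the GPU driver is compatible
print(version("flash-attn"))      # should report 0.2.8
```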

Have you tried torch version 2.0.0?

@Richar-Du Maybe you can check your NVCC version; you'd better use NVCC > 11.7. Hope this helps!
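A small sketch for checking this (assumes `nvcc` is on PATH and torch is installed):

```python
# Compare the system nvcc with the CUDA version torch was built against.
import subprocess
import torch

print("torch built with CUDA:", torch.version.cuda)
result = subprocess.run(["nvcc", "--version"], capture_output=True, text=True)
print(result.stdout)  # look for "release 11.7" or newer
```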

Hi, we have uploaded this script to the TOOLS folder.