
windows :ImportError: flash_attn is not installed

Open yunzhichen opened this issue 1 year ago • 1 comments

On Windows, when I run the demo program and load the model from a local directory, it keeps failing with: raise ImportError('flash_attn is not installed.') ImportError: flash_attn is not installed. Installing it via pip install flash_attn always fails. How can I resolve this?

yunzhichen avatar May 22 '24 13:05 yunzhichen

For flash attention on Windows, please follow the instructions at https://github.com/Dao-AILab/flash-attention. Note that getting flash attention to run on Windows can be difficult.

htian01 avatar May 23 '24 08:05 htian01

You can run the model without flash attention. If you are using the InternVL-Chat-V1-5 model, open config.json in the downloaded model directory and change "attn_implementation": "flash_attention_2" to "eager"; this disables flash attention.
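The config.json edit above can also be done with a small script. This is a minimal sketch, not an official tool; the function name `disable_flash_attention` and the example path are hypothetical, and it assumes the checkpoint's config.json carries a top-level "attn_implementation" key as described above.

```python
import json

def disable_flash_attention(config_path):
    """Switch a locally downloaded checkpoint's config.json to the
    'eager' attention implementation so flash_attn is not required.

    config_path: path to the model's config.json (hypothetical helper).
    """
    with open(config_path, "r", encoding="utf-8") as f:
        config = json.load(f)
    # "flash_attention_2" -> "eager" turns off flash attention
    config["attn_implementation"] = "eager"
    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(config, f, indent=2, ensure_ascii=False)

# Example (path is illustrative):
# disable_flash_attention("InternVL-Chat-V1-5/config.json")
```

After the edit, loading the model no longer imports flash_attn, at the cost of the slower eager attention path.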


czczup avatar May 30 '24 13:05 czczup