InternVL
Windows: ImportError: flash_attn is not installed
On Windows, when I run the demo program and load the model from a local directory, it always fails with: raise ImportError('flash_attn is not installed.') ImportError: flash_attn is not installed. Installing it via pip install flash_attn never succeeds. How can I fix this?
For flash attention on Windows, please follow the instructions at https://github.com/Dao-AILab/flash-attention. It can be difficult to get flash attention running on Windows.
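If you do want to try installing it, the flash-attention repository documents an install command along the lines of the sketch below; note that on Windows this builds CUDA kernels locally, so the stated prerequisites (a matching CUDA toolkit and MSVC build tools) are assumptions about your environment, not something the demo checks for:

```
# Documented in the flash-attention README; builds from source,
# so a CUDA toolkit and a C++ compiler (MSVC on Windows) are required.
pip install flash-attn --no-build-isolation
```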
You can also run the model without flash attention. For the InternVL-Chat-V1-5 model, open config.json in the downloaded model files and change "attn_implementation": "flash_attention_2" to "eager"; this disables flash attention.
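Concretely, the edit looks like this (an excerpt only; the other keys in the actual InternVL-Chat-V1-5 config.json are omitted here):

```json
{
  "attn_implementation": "eager"
}
```

With this setting the model falls back to the standard attention implementation, which is slower than flash attention but removes the flash_attn dependency entirely.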