
Error: ModuleNotFoundError: No module named 'flash_attn.flash_attention'

Open Charimanhua opened this issue 1 year ago • 3 comments

Running the example code:

```python
import torch
from PIL import Image

import cn_clip.clip as clip
from cn_clip.clip import load_from_name, available_models

print("Available models:", available_models())
# Available models: ['ViT-B-16', 'ViT-L-14', 'ViT-L-14-336', 'ViT-H-14', 'RN50']

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = load_from_name("ViT-B-16", device=device, download_root='./')
model.eval()
image = preprocess(Image.open("examples/pokemon.jpeg")).unsqueeze(0).to(device)
text = clip.tokenize(["杰尼龟", "妙蛙种子", "小火龙", "皮卡丘"]).to(device)

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)
    # Normalize the features; use the normalized image/text features for downstream tasks
    image_features /= image_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)

    logits_per_image, logits_per_text = model.get_similarity(image, text)
    probs = logits_per_image.softmax(dim=-1).cpu().numpy()

print("Label probs:", probs)  # [[1.268734e-03 5.436878e-02 6.795761e-04 9.436829e-01]]
```

raises this error:

```
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[2], line 4
      2 from PIL import Image
      3 import torch.nn.functional as F
----> 4 import cn_clip.clip as clip
      5 from cn_clip.clip import load_from_name
      7 # Load the Chinese-CLIP model and preprocessor

File ~/HuahaiRan/Chinese-CLIP-master/cn_clip/clip/__init__.py:4
      1 from .bert_tokenizer import FullTokenizer
      3 _tokenizer = FullTokenizer()
----> 4 from .model import convert_state_dict
      5 from .utils import load_from_name, available_models, tokenize, image_transform, load

File ~/HuahaiRan/Chinese-CLIP-master/cn_clip/clip/model.py:16
     14 import importlib.util
     15 if importlib.util.find_spec('flash_attn'):
---> 16     FlashMHA = importlib.import_module('flash_attn.flash_attention').FlashMHA
     18 from cn_clip.clip import _tokenizer
     19 from cn_clip.clip.configuration_bert import BertConfig

File ~/anaconda3/envs/meme/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
    124         break
    125     level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

ModuleNotFoundError: No module named 'flash_attn.flash_attention'
```

But when I check the environment,

[screenshots of the installed packages]

the corresponding library is installed. How can I resolve this? Thanks!
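
For reference, a minimal check (standard library only) of whether the installed flash-attn build still ships the submodule that cn_clip tries to import:

```python
import importlib.util

# cn_clip/clip/model.py only checks for the top-level 'flash_attn' package,
# then imports 'flash_attn.flash_attention' unconditionally. The second line
# prints None on builds where that submodule no longer exists.
print(importlib.util.find_spec("flash_attn"))
print(importlib.util.find_spec("flash_attn.flash_attention"))
```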

Charimanhua · Nov 27 '24

Same problem.

sunzhaoyang1 · Dec 25 '24

Same problem +1. So what is the cause?

Bernice123 · Jan 02 '25

It's probably that your flash_attn version is too new. I also hit this error with 2.7.1 installed; downgrading to 1.0.9 fixed it. It seems flash_attn 2.x no longer has the flash_attention submodule.
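
As a sketch (a hypothetical patch, not the shipped code), the guard in cn_clip/clip/model.py could test for the 1.x submodule itself rather than just the top-level package, so flash-attn 2.x installs would not crash at import time:

```python
import importlib
import importlib.util

# Hypothetical defensive variant of the import guard: only bind FlashMHA when
# the flash-attn 1.x submodule actually exists; otherwise leave it as None
# (assuming FlashMHA is only referenced when flash attention is enabled).
FlashMHA = None
if importlib.util.find_spec("flash_attn") and importlib.util.find_spec("flash_attn.flash_attention"):
    FlashMHA = importlib.import_module("flash_attn.flash_attention").FlashMHA
```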

tanyz0208 · Jan 19 '25

The cn_clip library has been updated, so pin the version when installing. In my environment, cn_clip 1.5.1 works fine, while 1.6.0 reports this error.
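
If it helps, a quick way to confirm which versions actually ended up in the environment (this reads the installed package metadata, independent of whether the imports succeed; the names assume the PyPI distributions cn_clip and flash-attn):

```python
from importlib.metadata import version

print("cn_clip:", version("cn_clip"))        # 1.5.1 reportedly works; 1.6.0 hits this error
print("flash-attn:", version("flash-attn"))  # 1.x still ships flash_attn.flash_attention
```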

orangerfun · Sep 02 '25