Hongji Zhu
It's not quite ready yet and needs some modifications; we will release the V2.0 fine-tuning code within two weeks.
@rover5056 The MiniCPM-V 2.0 fine-tuning code has been integrated into [swift](https://github.com/modelscope/swift/blob/main/docs/source/Multi-Modal/minicpm-v-2%E6%9C%80%E4%BD%B3%E5%AE%9E%E8%B7%B5.md). Feel free to use it, and let us know if you run into any issues~
The web demo code is now available, [enjoy](https://github.com/OpenBMB/MiniCPM-V?tab=readme-ov-file#%E6%9C%AC%E5%9C%B0webui-demo%E9%83%A8%E7%BD%B2)
We are working on it.
We will release the technical report in several weeks.
We are working on it; please stay tuned.
Coming soon!
Which GPU are you using, an A100, a 3090, or something else?
Older GPUs without BF16 support should use fp16; try this:

```python
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained('openbmb/MiniCPM-V-2', trust_remote_code=True)
# For Nvidia GPUs that support BF16 (like A100, H100, RTX3090)
# model = model.to(device='cuda', dtype=torch.bfloat16)
# For older Nvidia GPUs that do NOT support BF16, use FP16 instead
model = model.to(device='cuda', dtype=torch.float16)
```
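For completeness, here is a minimal end-to-end sketch of running inference after loading in fp16. The `model.chat(...)` call, its arguments, and the return values follow the usage example in the MiniCPM-V README; the image path `'demo.jpg'` is a placeholder, so adjust everything to your local setup.

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

# Load the model and tokenizer; fp16 for GPUs without BF16 support
model = AutoModel.from_pretrained('openbmb/MiniCPM-V-2', trust_remote_code=True)
model = model.to(device='cuda', dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained('openbmb/MiniCPM-V-2', trust_remote_code=True)
model.eval()

# 'demo.jpg' is a placeholder; replace it with your own image
image = Image.open('demo.jpg').convert('RGB')
msgs = [{'role': 'user', 'content': 'What is in the image?'}]

# chat() call assumed from the MiniCPM-V README example
res, context, _ = model.chat(
    image=image,
    msgs=msgs,
    context=None,
    tokenizer=tokenizer,
    sampling=True,
    temperature=0.7
)
print(res)
```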