Charimanhua

Results: 7 comments of Charimanhua

> I want to know if you solved this problem, because I ran into the same issue even though the model I'm using is different from yours.

Yes, I have...

> Hello, [@mxchinegod](https://github.com/mxchinegod). Thank you for pointing out this issue! We've released the latest version 1.3.4 on PyPI, where this issue has been fixed.

I updated to FlagEmbedding==1.3.4 and still get this error.

Have you run into the error `ModuleNotFoundError: No module named 'flash_attn.flash_attention'`? How did you solve it? Thanks!
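For context, this error usually appears when code written against the flash-attn 1.x module layout (`flash_attn.flash_attention`) runs against flash-attn 2.x, where that module no longer exists, or when flash-attn is not installed at all. A minimal, hedged workaround sketch, assuming the calling code is able to fall back to standard attention when the import fails:

```python
# Guarded import: the 'flash_attn.flash_attention' module only exists in the
# flash-attn 1.x layout. If it is missing (flash-attn 2.x, or flash-attn not
# installed), fall back to a sentinel so callers can use standard attention.
try:
    from flash_attn.flash_attention import FlashAttention  # flash-attn 1.x layout
except (ModuleNotFoundError, ImportError):
    FlashAttention = None  # signal: use standard (non-flash) attention instead

use_flash_attention = FlashAttention is not None
```

This only papers over the import; whether the surrounding model code actually supports a non-flash attention path depends on the specific repository.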

After fine-tuning, calling the fine-tuned model raises an error: ![401730889265_ pic](https://github.com/user-attachments/assets/e86640ea-da3f-49e3-b85e-b8aef8d77071) How can this be fixed? The model directory contents are as follows: ![411730889276_ pic](https://github.com/user-attachments/assets/98aafd05-e359-4ae6-b87a-1dae77bcf595)

At which step should merge lora be performed? I don't quite understand. Thanks!

After fine-tuning got-ocr2.0, loading the saved fine-tuned model raises a model_type error. How can I fix it? ![image](https://github.com/user-attachments/assets/56bd84aa-8576-406a-8a63-d759ba6136d0) ![image](https://github.com/user-attachments/assets/2f410c84-f416-462e-933d-9089ab47bc9f)

> > After fine-tuning got-ocr2.0, loading the saved fine-tuned model raises a model_type error. How can I fix it?
>
> You need to merge the LoRA weights first; only then will the config.json file be generated.

Solved, thanks! `cd` into the fine-tuned model directory and run: `swift merge-lora --ckpt_dir xxx`