Charimanhua
> I want to know whether you solved this problem, because I am running into the same issue even though the model I use is different from yours. Yes, I have...
> Hello, [@mxchinegod](https://github.com/mxchinegod). Thank you for pointing out this issue! We've released the latest version 1.3.4 on PyPI, where this issue has been fixed.

I updated to FlagEmbedding==1.3.4 and still get this error.
Has anyone else hit the error `ModuleNotFoundError: No module named 'flash_attn.flash_attention'`? How did you solve it? Thanks!
After fine-tuning, calling the fine-tuned model raises an error. How can this be resolved? The model directory contents are as follows:
At which step should the LoRA merge (merge lora) be performed? I don't quite understand, thanks!
After fine-tuning got-ocr2.0, using the saved fine-tuned model raises a model_type error. How can this be resolved?
> > After fine-tuning got-ocr2.0, using the saved fine-tuned model raises a model_type error. How can this be resolved?
>
> You need to merge the LoRA weights; only then will a config.json file be generated.

Solved, thanks! `cd` into the fine-tuned model's directory and run: `swift merge-lora --ckpt_dir xxx`
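A minimal sketch of that fix, assuming the ms-swift CLI is installed and that the checkpoint path (here a placeholder, `output/checkpoint-xxx`) points at your LoRA fine-tuning output; by convention ms-swift writes the merged full model next to it with a `-merged` suffix:

```shell
# Hypothetical checkpoint directory; replace with your own LoRA output path.
CKPT_DIR="output/checkpoint-xxx"
MERGED_DIR="${CKPT_DIR}-merged"   # where the merged weights (with config.json) land

# Merge the LoRA adapter into the base weights, if the ms-swift CLI is present.
if command -v swift >/dev/null 2>&1; then
    swift merge-lora --ckpt_dir "$CKPT_DIR"
fi

echo "Load the merged model from: $MERGED_DIR"
```

After the merge, point your inference code at the `-merged` directory rather than the raw LoRA checkpoint, so `config.json` (and the `model_type` it declares) is found.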