
Question about the save method of llm_instruction_reranker

Open · FleetCommander004 opened this issue 1 year ago · 1 comment

Thank you very much for the FlagEmbedding package. I ran into a problem when using llm_instruction_reranker. It concerns the following method, from: https://github.com/FlagOpen/FlagEmbedding/blob/13da7435aba2c4cfbbd7caa4c595fe4862f6ba19/FlagEmbedding/llm_reranker/finetune_for_instruction/trainer.py#L9C2-L29C1 Following https://github.com/FlagOpen/FlagEmbedding/issues/749, I commented out these three lines:

```python
if not self.use_lora:
    super()._save(output_dir, state_dict)
    return
```
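For context, a rough sketch of the `_save` override in question (the class name and the exact LoRA save call here are illustrative assumptions, not the repository's actual code; see the linked trainer.py). The key point is that with `use_lora` enabled, only adapter weights reach disk:

```python
# Illustrative sketch of the _save override; class name and the LoRA
# save call are assumptions made for this example.
import os
from typing import Optional

from transformers import Trainer

class LLMInstructionRerankerTrainer(Trainer):  # hypothetical name
    use_lora: bool = True  # set by the finetuning script

    def _save(self, output_dir: Optional[str] = None, state_dict=None):
        output_dir = output_dir if output_dir is not None else self.args.output_dir
        os.makedirs(output_dir, exist_ok=True)

        # The three lines commented out per issue #749: for full-parameter
        # training, defer to the stock HuggingFace Trainer save.
        if not self.use_lora:
            super()._save(output_dir, state_dict)
            return

        # LoRA path: only the adapter weights are written, so the resulting
        # checkpoint is not a complete GemmaForCausalLM state dict and cannot
        # be loaded directly with AutoModelForCausalLM.
        self.model.save_pretrained(output_dir)
```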

However, when I load the fine-tuned model with AutoModelForCausalLM, I still get the error: Some weights of the model checkpoint at /models_hf/bge-reranker-sft/zhihu_qa_llm_1/ were not used when initializing GemmaForCausalLM: [screenshot of the warning, 2024-09-04] What could be going wrong?

FleetCommander004 · Sep 04 '24

You can simply save the LoRA parameters and merge them into the base model afterwards. Reference code for merging the model:

```python
from FlagEmbedding.llm_reranker.merge import merge_llm

# Arguments: base model, saved LoRA adapter directory, output directory
# for the merged full-weight checkpoint.
merge_llm('google/gemma-2b', 'lora_llm_output_path', 'merged_model_output_paths')
```
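For reference, the merge step corresponds to the standard PEFT merge-and-unload flow; the following is a minimal sketch under that assumption (merge_llm's internals may differ, and the paths reuse the placeholders above):

```python
# Minimal PEFT merge sketch (assumes the adapter was saved with peft).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained('google/gemma-2b')
model = PeftModel.from_pretrained(base, 'lora_llm_output_path')

# Fold the LoRA deltas into the base weights and drop the adapter wrappers.
merged = model.merge_and_unload()
merged.save_pretrained('merged_model_output_paths')

# The tokenizer is unchanged by LoRA; save it too so the merged
# directory is self-contained.
tokenizer = AutoTokenizer.from_pretrained('google/gemma-2b')
tokenizer.save_pretrained('merged_model_output_paths')
```

Once merged, the directory holds a plain GemmaForCausalLM checkpoint, so AutoModelForCausalLM.from_pretrained('merged_model_output_paths') loads it without the unused-weights warning.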

545999961 · Sep 05 '24