Langchain-Chatchat
[BUG] When langchain loads the ChatGLM3 model, the output contains "<|user|>" tags and the model asks and answers its own questions; loading the model with fschat produces essentially normal output
Regarding wiki Q20 ("<|user|>" tags appear in chatglm3-6b conversations and the model asks and answers its own questions): the behavior is shown in the screenshot. Normal result: calling worker.model and worker.tokenizer directly on the worker created in the create_model_worker_app function in startup.py produces output without the <|user|> tag. Abnormal result: testing through the chains.llmchain_with_history code in the project's chains folder, and through the UI.

Environment: langchain 0.0.354, fschat 0.2.35. Model used: ChatGLM3-6B merged with LoRA weights. Non-Docker deployment; all other settings at defaults. Note: the abnormal output does not appear when using the original ChatGLM3-6B!
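For reference, a minimal sketch (not the project's own test harness) of the kind of direct check described above: loading the merged checkpoint with plain transformers and calling ChatGLM3's chat() method, which applies the model's own conversation template. The model path is a placeholder.

```python
# Minimal sketch: load the LoRA-merged checkpoint directly and check whether
# its raw replies contain the "<|user|>" tag. The path is a placeholder.
from transformers import AutoModel, AutoTokenizer

model_path = "/path/to/chatglm3-6b-lora-merged"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
model = model.eval()

# ChatGLM3's custom chat() builds the prompt from the model's conversation
# template, so role tokens such as <|user|> should not leak into the reply.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```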
🤖
Based on the issue you've described, it seems like the problem might be related to how the model is being loaded in Langchain-Chatchat, specifically when using the chains.llmchain_with_history code.
In the earlier issue chatglm3-6b问题, another user reported the same behavior, with the model talking to itself. The fix in that case was to adjust how the model was loaded and used.
Since you mention the issue does not occur with the original ChatGLM3-6B, the problem is likely specific to the LoRA-merged model and to how it is loaded and used in Langchain-Chatchat.
One possible solution could be to try loading and using the merged model in the same way as the original ChatGLM3-6B model. If the issue persists, it might be necessary to look into the specifics of how the merged model is structured and how it interacts with Langchain-Chatchat.
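For reference, a minimal sketch of a typical LoRA merge with peft, assuming the adapter was trained with peft; all paths are placeholders. Keeping the base model's tokenizer and config files next to the merged weights matters here, because they define the <|user|> / <|assistant|> special tokens used by the conversation template.

```python
# A minimal sketch, assuming a peft-trained LoRA adapter; paths are placeholders.
from peft import PeftModel
from transformers import AutoModel, AutoTokenizer

base_path = "/path/to/chatglm3-6b"     # original base model (placeholder)
lora_path = "/path/to/lora-adapter"    # trained LoRA adapter (placeholder)
out_path = "/path/to/chatglm3-6b-lora-merged"

base = AutoModel.from_pretrained(base_path, trust_remote_code=True)
merged = PeftModel.from_pretrained(base, lora_path)
merged = merged.merge_and_unload()     # fold the LoRA weights into the base model

merged.save_pretrained(out_path)

# Save the base tokenizer alongside the merged weights so that fschat and
# Langchain-Chatchat see the same conversation template as the original model.
AutoTokenizer.from_pretrained(base_path, trust_remote_code=True).save_pretrained(out_path)

# Note: the custom modeling_chatglm*.py / tokenization files from the base
# repository may also need to be copied into out_path manually.
```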
I hope this helps! If you have any further questions or if the issue persists, please provide more details so I can assist you better.
Update to the latest HF files for the GLM model.
Hello, may I ask how to update them?
They are on Hugging Face; just run git pull in the model directory to update.
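If the model directory was not cloned with git, one alternative (a sketch, not the project's recommended path) is to let huggingface_hub re-download any files that changed upstream; the local directory below is a placeholder.

```python
# Refresh the local chatglm3-6b files from the Hugging Face Hub.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="THUDM/chatglm3-6b",
    local_dir="/path/to/chatglm3-6b",  # placeholder for the local model directory
)
```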
This problem still occurs. How can it be solved?