Langchain-Chatchat
[BUG] Brief description of the issue / How do I load a model fine-tuned from ChatGLM-6B?
Problem description / How do I load a model fine-tuned from ChatGLM-6B?
Hello, I have fine-tuned adgen-chatglm-6b-pt-45 from ChatGLM-6B's chatglm-6b base model on my own data. I would now like to use it with Langchain-Chatchat as well; how should I modify the configuration?
Hello, @LGghost! I'm Dosu, a friendly bot ready to help you solve problems, answer questions, and even become a contributor. I'm here to help while we wait for a human maintainer to respond. I'll start looking into your issue now and get back to you shortly with a full answer.
OK. How do I use the chatglm-6b model from ChatGLM-6B here,
fine-tuned on my own data?
To use your fine-tuned adgen-chatglm-6b-pt-45 model with Langchain-Chatchat, follow these steps:
- Update Model Configuration: Add your model to `configs/model_config.py` under `LLM_MODELS` and specify its path or identifier in `MODEL_PATH`, like so:
LLM_MODELS = ["chatglm3-6b", "zhipu-api", "openai-api", "adgen-chatglm-6b-pt-45"]
MODEL_PATH = {
"llm_model": {
"adgen-chatglm-6b-pt-45": "path_or_identifier_for_adgen-chatglm-6b-pt-45",
},
}
Replace "path_or_identifier_for_adgen-chatglm-6b-pt-45" with the actual path or identifier of your model.
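Note that a P-Tuning v2 checkpoint such as adgen-chatglm-6b-pt-45 contains only the prefix-encoder weights, not a full model, so the base chatglm-6b model is loaded first and the prefix weights are merged into it. The key-remapping step used in the official ChatGLM-6B ptuning examples can be sketched as below (the dummy dictionary stands in for the real `torch.load(...)` result; verify the exact loading flow against your Langchain-Chatchat version):

```python
# Sketch: extract prefix-encoder weights from a P-Tuning v2 checkpoint's
# state dict, stripping the "transformer.prefix_encoder." module prefix so
# they can be loaded into model.transformer.prefix_encoder. In real use the
# input comes from torch.load(os.path.join(checkpoint_dir, "pytorch_model.bin")).
PREFIX = "transformer.prefix_encoder."

def extract_prefix_encoder(state_dict: dict) -> dict:
    """Keep only prefix-encoder entries and drop the module-path prefix."""
    return {
        key[len(PREFIX):]: value
        for key, value in state_dict.items()
        if key.startswith(PREFIX)
    }

# Dummy tensors as stand-ins for real checkpoint weights:
dummy_state_dict = {
    "transformer.prefix_encoder.embedding.weight": [0.1, 0.2],
    "transformer.word_embeddings.weight": [0.3],
}
remapped = extract_prefix_encoder(dummy_state_dict)
print(remapped)  # only the prefix-encoder entry survives, prefix stripped
```

With a real checkpoint, the resulting dict would be passed to `model.transformer.prefix_encoder.load_state_dict(...)` after loading the base model with `pre_seq_len` set in its config.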
- Manage Models via Endpoints: Use the provided endpoints (`/llm_model/list_config_models`, `/llm_model/change`, `/llm_model/get_model_config`) to manage and switch to your model. Ensure it's listed and properly configured before making chat requests.
- Integration and Testing: After switching to your model, test the chat functionality to ensure it works as expected. Adjust configurations as needed based on the test outcomes.
- Deployment: Once satisfied with the testing, proceed to deploy your changes, making sure all model dependencies are correctly set up.
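The model-switch step above can be sketched as a plain HTTP call. The default port 7861 and the payload field names are assumptions here; check `server/llm_api.py` in your version for the exact schema:

```python
# Hedged sketch: switch the serving model via the /llm_model/change endpoint
# listed above. BASE and the payload field names are assumptions -- verify
# against server/llm_api.py for your Langchain-Chatchat version.
import json
import urllib.request

BASE = "http://127.0.0.1:7861"  # assumed default API server address

def build_change_payload(current: str, new: str) -> dict:
    # Assumed field names for the change request body.
    return {"model_name": current, "new_model_name": new}

def change_model(current: str, new: str) -> dict:
    """POST the change request and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{BASE}/llm_model/change",
        data=json.dumps(build_change_payload(current, new)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `change_model("chatglm3-6b", "adgen-chatglm-6b-pt-45")` would ask a running API server to swap in the fine-tuned model.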
This guide assumes familiarity with Langchain-Chatchat's structure and model management. For detailed implementation, refer to the Langchain-Chatchat repository.
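As a quick smoke test after switching, a chat request can be sent to the API. The `/chat/chat` path and the request fields below are assumptions based on the v0.2-style API; verify them against `server/api.py`:

```python
# Hedged sketch: build the request body for a non-streaming chat call.
# Field names ("query", "model_name", "stream") are assumptions -- check
# server/api.py for the actual chat endpoint schema in your version.
def build_chat_payload(query: str, model: str) -> dict:
    return {"query": query, "model_name": model, "stream": False}

payload = build_chat_payload("Hello", "adgen-chatglm-6b-pt-45")
print(payload)
```

Posting this payload to `http://127.0.0.1:7861/chat/chat` (same assumed address as above) should return a reply generated by the fine-tuned model if the switch succeeded.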
Details
Sources
- 在线embedding的使用
- 微调模型的导入
- 使用模型chatglm3-6b,如何在langchain-chatchat交互中调用到chatglm3-6b自定义的function,请提供实例
- 使用模型chatglm3-6b,如何在langchain-chatchat交互中调用到chatglm3-6b自定义的工具(function calling),请提供实例
- V0.2.4以上版本如何加载ptunig-v2的微调模型?
- [FEATURE] 如何接入CustomLLM?
- [询问帖] 我该如何在该项目使用ChatGLM-LoRA呢
- server/api.py
- README_en.md
- configs/model_config.py.example
- server/llm_api.py
- tests/test_online_api.py
- server/agent/custom_agent/ChatGLM3Agent.py
- knowledge_base/samples/content/test_files/langchain-ChatGLM_closed.jsonl
- knowledge_base/samples/content/test_files/langchain-ChatGLM_open.jsonl
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.