Langchain-Chatchat
Agent does not invoke tools
Problem Description
During Agent chat, the model knows which tool it should use, but it never actually invokes the tool.
Invocation log:
Environment Information
- langchain-ChatGLM version/commit: 2.10
- Model used (ChatGLM2-6B / Qwen-7B, etc.): ChatGLM3-6B
- Embedding model used (moka-ai/m3e-base, etc.): bge-large-zh-v1.5
- Vector store type (faiss / milvus / pg_vector, etc.): pg_vector
I've seen others open this same issue, but no solution was posted. Does anyone have experience with this?
Hey @xiaohundun, long time no see! Hope you haven't been too swamped with those tricky issues. Let's crack this one together! 🛠️
Based on the information you've provided, it seems like the issue might be related to the tool invocation logic in your application. The StructuredChatOutputParserWithRetries class in the Langchain-Chatchat codebase is responsible for parsing the output from the language model and identifying tool calls. However, the actual invocation of the tool based on the parsed action is not shown in the provided code snippet.
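A minimal sketch of what such a parse step typically looks like (a hypothetical helper using only the standard library, not the actual `StructuredChatOutputParserWithRetries` implementation — the structured-chat format expects the model to emit a JSON blob with `action` and `action_input` keys):

```python
import json
import re


def parse_tool_call(llm_output: str):
    """Extract an (action, action_input) pair from a model reply.

    Looks for a JSON blob of the form {"action": ..., "action_input": ...}.
    Returns None when the reply contains no usable tool call, i.e. the
    model produced a plain final answer instead of invoking a tool.
    """
    match = re.search(r"\{.*\}", llm_output, re.DOTALL)
    if not match:
        return None
    try:
        blob = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "action" not in blob:
        return None
    return blob["action"], blob.get("action_input", {})


reply = 'Thought: I need the weather.\n{"action": "weather", "action_input": {"city": "Beijing"}}'
print(parse_tool_call(reply))  # ('weather', {'city': 'Beijing'})
print(parse_tool_call("Final Answer: it is sunny."))  # None
```

If the model's reply never matches this shape (for example, ChatGLM3 emitting its own tool-call syntax instead of the JSON blob the parser expects), parsing yields no action and no tool is ever invoked — which matches the symptom described above.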
Here are some steps you can take to troubleshoot and potentially resolve the issue:
- Ensure that the tool invocation logic is correctly implemented and integrated within the agent's workflow. This typically involves mapping the action name to a specific tool or function call within your application, passing the extracted parameters to that tool or function, and handling the tool's response and integrating it back into the chat session. You can refer to the conceptual example provided in the context for more details.
- Verify that the `chatglm3-6b` model and its tokenizer are correctly placed in the directory specified by `MODEL_ROOT_PATH` in your `model_config.py.example` file. Also check the integrity of the `chatglm3-6b` model files, especially the tokenizer model file, to ensure they are not corrupted; re-download or obtain a fresh copy if necessary.
- Ensure that the custom tool is included in the `tools` sequence when initializing `StructuredGLM3ChatAgent` in the `initialize_glm3_agent` function. Also check that the tool's name is correctly recognized in the `parse` method of the `StructuredChatOutputParserWithRetries` class.
- Remove the OpenAI-related model configurations and keep only the `chatglm` configurations in `model_config.py`. Also, try changing the port number in both `server_config.py` and `model_config.py` from `8888` to `7777`; if that doesn't work, change it back to `8888`.
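The first step above — mapping the parsed action name to a tool and passing it the extracted parameters — can be sketched as a simple name-to-callable registry. All names here (`get_weather`, `TOOLS`, `dispatch`) are hypothetical examples for illustration, not the project's actual wiring, which goes through `initialize_glm3_agent` and its `tools` sequence:

```python
def get_weather(city: str) -> str:
    """Hypothetical example tool; a real tool would call a weather API."""
    return f"Sunny in {city}"


# Registry mapping the action name the model emits to a callable.
# If a tool is missing here (or from the agent's `tools` sequence),
# the agent can "know" the tool's name yet never be able to call it.
TOOLS = {"weather": get_weather}


def dispatch(action: str, action_input: dict) -> str:
    """Look up the parsed action and invoke it with the parsed arguments."""
    tool = TOOLS.get(action)
    if tool is None:
        raise KeyError(f"Unknown tool: {action!r}")
    return tool(**action_input)


print(dispatch("weather", {"city": "Beijing"}))  # Sunny in Beijing
```

The tool's observation (the return value) would then be fed back into the chat session as the next model input, which is the integration step the parsed-but-never-invoked symptom suggests is missing or misconfigured.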
If these steps do not resolve the issue, it may be necessary to provide more information or seek further assistance from the community or the maintainers of the Langchain-Chatchat repository.
Here are some relevant issues that might help you:
- Calling a custom Agent to query the weather returns nothing at all
- [BUG] RuntimeError: Internal: could not parse ModelProto from chatglm3-6b/tokenizer.model
- Using agent_chat to call a custom tool, the custom tool is never invoked
- [BUG] Starts with no errors and the chatglm model is configured, but it still calls chatgpt, so questions get no response
You can also refer to the source code of the StructuredChatOutputParserWithRetries and StructuredGLM3ChatAgent classes in the Langchain-Chatchat codebase for more insights:
I hope this helps! If you have any more questions or need further clarification, feel free to ask.
Sources
- Calling a custom Agent to query the weather returns nothing at all
- [BUG] RuntimeError: Internal: could not parse ModelProto from chatglm3-6b/tokenizer.model
- Using agent_chat to call a custom tool, the custom tool is never invoked
- [BUG] Starts with no errors and the chatglm model is configured, but it still calls chatgpt, so questions get no response
- server/agent/custom_agent/ChatGLM3Agent.py
- knowledge_base/samples/content/test_files/langchain-ChatGLM_open.jsonl
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Same problem here; I'm not sure whether this project is still being maintained.
Same question, and it's urgent!!!
Following.
This will be redone in 0.3.x; this agent implementation is too old. I'm the one writing that part.
Roughly when will 0.3.x be released?