Streaming function call support for the openai_api_compatible LLM has a bug
Self Checks
- [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
main branch
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
1. Configure an LLM under openai_api_compatible that supports streaming function calls.
2. Use that LLM in an agent.
3. Query the agent.
✔️ Expected Behavior
The agent answers my query correctly.
❌ Actual Behavior
The web UI reports `Unterminated string starting at: line 1 column 2 (char 1)`.
Hello @hh1562535601! I'm here to assist you with any bugs, questions, or contributions you have for the project. Let's work together to resolve the issue you're facing.
The error `Unterminated string starting at: line 1 column 2 (char 1)` in the streaming function call support for the openai_api_compatible LLM, when used in an agent, is likely caused by an issue in the `_handle_generate_stream_response` method. The error occurs when attempting to parse a JSON string from the streamed response and the `decoded_chunk` is not valid JSON, which can happen if the response contains malformed or incomplete data. The code attempts to handle this by yielding a result chunk with a finish reason of "Non-JSON encountered" and then breaking out of the loop [1].
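To see where the exact message comes from, here is a minimal illustration (not Dify's actual stream handler): if a streamed payload is cut off right after the opening quote of its first key, `json.loads` raises precisely the error reported above.

```python
import json

# Hypothetical truncated chunk: the payload ends right after the opening
# quote of the first key, i.e. the unterminated string starts at char 1.
partial_chunk = '{"choices'

try:
    json.loads(partial_chunk)
    error_msg = None
except json.JSONDecodeError as e:
    error_msg = str(e)

print(error_msg)  # Unterminated string starting at: line 1 column 2 (char 1)
```

This shows the error is a symptom of feeding an incomplete fragment to the JSON decoder, not of the model producing bad output.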
To continue talking to Dosu, mention @dosu.
I changed the code:
```diff
diff --git a/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py b/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
index 356ac56b1..632f18248 100644
--- a/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
+++ b/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
@@ -438,7 +438,7 @@ class OAIAPICompatLargeLanguageModel(_CommonOaiApiCompat, LargeLanguageModel):
                 for new_tool_call in new_tool_calls:
                     # get tool call
-                    tool_call = get_tool_call(new_tool_call.function.name)
+                    tool_call = get_tool_call(new_tool_call.id)
                     # update tool call
                     if new_tool_call.id:
                         tool_call.id = new_tool_call.id
```
and then it works. @dosu
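The fix is plausible because, in streamed responses, a tool call's `function.name` typically arrives only in the first delta, while later deltas carry only argument fragments, so looking up the accumulator by name can miss those continuation chunks and leave the arguments string incomplete (hence the later JSON error). A minimal sketch of accumulating deltas keyed by a stable id (class and field names here are illustrative, not Dify's actual code):

```python
from dataclasses import dataclass


@dataclass
class ToolCall:
    """Accumulated state for one streamed tool call."""
    id: str = ""
    name: str = ""
    arguments: str = ""


def accumulate(deltas: list[dict]) -> list[ToolCall]:
    calls: dict[str, ToolCall] = {}
    current_id = None
    for d in deltas:
        # Only the first delta for a call carries the id; continuation
        # deltas may omit it, so remember the last id seen.
        if d.get("id"):
            current_id = d["id"]
        call = calls.setdefault(current_id, ToolCall(id=current_id))
        if d.get("name"):
            call.name = d["name"]
        # Argument fragments must be concatenated across deltas.
        call.arguments += d.get("arguments", "")
    return list(calls.values())


deltas = [
    {"id": "call_1", "name": "get_weather", "arguments": '{"city": '},
    {"arguments": '"Paris"}'},  # no name here: keying by name would drop this
]
(result,) = accumulate(deltas)
print(result.arguments)  # {"city": "Paris"}
```

Keying by name would create a fresh, empty entry for the second delta (whose name is absent), so neither entry would hold the complete `{"city": "Paris"}` argument string.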
Hi, @hh1562535601. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.
Issue Summary
- A bug was reported in the streaming function call support for the `openai_api_compatible` LLM in a self-hosted environment.
- The error involved an unterminated string, likely due to malformed JSON in the streamed response.
- You resolved the issue by modifying the code to use `new_tool_call.id` instead of `new_tool_call.function.name`.
Next Steps
- Please confirm if this issue is still relevant to the latest version of the Dify repository. If so, you can keep the discussion open by commenting on the issue.
- Otherwise, the issue will be automatically closed in 15 days.
Thank you for your understanding and contribution!