
Chat room streaming response interrupted

Open starweavingdream opened this issue 1 year ago • 2 comments

Self Checks

  • [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • [X] I have searched for existing issues, including closed ones.
  • [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [X] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • [X] Please do not modify this template :) and fill in all the required fields.

Dify version

0.8.3

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

When calling the general chat assistant through the server API with a streaming response, the stream is interrupted once the number of consecutive questions reaches about 30. After the interruption, blank records appear in the message table.

In the first console screenshot, I marked the relevant log lines in Chinese. During a streaming response, the code logs 执行到这就代表了返回了数据: ("reaching this point means data was returned:") followed by the streamed chunk, and the model call result is marked 模型流式调用响应结果: ("model streaming-call response result:"). According to these logs, the model correctly returned the result data, yet the streaming response was interrupted abnormally and there is no corresponding error log.

(screenshot attachment: 97E8453F-615B-4b34-AC65-3ECAD97B8ACC)
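The repeated-question loop above can be sketched as a minimal Python client. This is a sketch under assumptions: it targets the standard self-hosted Dify service API endpoint `POST /v1/chat-messages` with `response_mode: "streaming"`, and the base URL, API key, user id, and turn count are placeholders, not values from the report.

```python
import json

API_BASE = "http://localhost/v1"  # placeholder: self-hosted Dify base URL
API_KEY = "app-xxx"               # placeholder: the app's API key

def data_payloads(lines):
    """Decode the JSON payload of each `data:` line of an SSE stream."""
    for raw in lines:
        if raw and raw.startswith("data:"):
            yield json.loads(raw[len("data:"):].strip())

def stream_chat(query, conversation_id=""):
    """Send one streaming chat turn; return the conversation id for the next turn."""
    import requests  # third-party (pip install requests)
    resp = requests.post(
        f"{API_BASE}/chat-messages",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "inputs": {},
            "query": query,
            "response_mode": "streaming",
            "conversation_id": conversation_id,
            "user": "repro-user",  # placeholder user id
        },
        stream=True,
        timeout=60,
    )
    resp.raise_for_status()
    for event in data_payloads(resp.iter_lines(decode_unicode=True)):
        conversation_id = event.get("conversation_id", conversation_id)
    return conversation_id

if __name__ == "__main__":
    conv = ""
    for i in range(35):  # the report says the stream breaks around turn 30
        conv = stream_chat(f"test question {i}", conv)
        print(f"turn {i} completed, conversation_id={conv}")
```

Each turn reuses the `conversation_id` returned by the previous one, so all 30+ questions land in the same conversation, matching the "consecutive questions" condition in the report.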

✔️ Expected Behavior

If the streaming is interrupted abnormally, there should be a corresponding exception log

❌ Actual Behavior

No response

starweavingdream avatar Oct 10 '24 07:10 starweavingdream

A similar issue was discussed in a closed thread where the chat response was interrupted. The solution involved modifying the default template in expert mode to:

{{#histories#}}
Human: {{#query#}}
Assistant:

This change allowed the chat to function correctly [1].

For your specific issue, the interruption might be due to error handling or streaming logic problems. Ensure exceptions are caught and logged properly, and verify that message table logging is triggered even in case of errors. The absence of error logs could be due to unhandled exceptions during the streaming process [2].
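The second point can be sketched generically: wrap the streaming generator so that any interruption is logged before the stream dies. A minimal illustration in Python (the wrapper and logger name are hypothetical, not Dify's actual code):

```python
import logging

logger = logging.getLogger("streaming")  # illustrative logger name

def logged_stream(chunks):
    """Re-yield chunks from a streaming generator, logging any interruption.

    Without a wrapper like this, an exception raised mid-stream (or a client
    disconnect surfacing as GeneratorExit) can silently end the response,
    leaving a blank record in the message table and no error log.
    """
    count = 0
    try:
        for chunk in chunks:
            count += 1
            yield chunk
    except GeneratorExit:
        logger.warning("stream closed by consumer after %d chunks", count)
        raise
    except Exception:
        logger.exception("stream aborted after %d chunks", count)
        raise
```

The wrapper re-raises after logging, so upstream behavior is unchanged; the only difference is that an abnormal interruption now leaves a traceback in the logs instead of vanishing.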

To continue talking to Dosu, mention @dosu.

dosubot[bot] avatar Oct 10 '24 07:10 dosubot[bot]

This didn't solve my problem. The failure occurred when the stream to the front end was interrupted, not because the model failed to answer.

starweavingdream avatar Oct 10 '24 08:10 starweavingdream

I found that the parsing failed because the API responded with `event: ping`.

starweavingdream avatar Oct 11 '24 01:10 starweavingdream
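A client-side fix consistent with that finding is to parse the stream as complete SSE events and skip `ping` keepalives, rather than JSON-decoding every line. A minimal sketch (the helper names are illustrative, not part of Dify's API):

```python
import json

def iter_sse_events(lines):
    """Group raw SSE lines into (event, data) pairs; a blank line ends an event."""
    event, data = "message", []
    for raw in lines:
        if raw == "":
            if data:
                yield event, "\n".join(data)
            event, data = "message", []  # reset for the next event
        elif raw.startswith("event:"):
            event = raw[len("event:"):].strip()
        elif raw.startswith("data:"):
            data.append(raw[len("data:"):].strip())

def chat_chunks(lines):
    """Yield decoded message payloads, skipping keepalive `ping` events."""
    for event, data in iter_sse_events(lines):
        if event == "ping":
            continue  # keepalive only; there is no JSON body worth parsing
        yield json.loads(data)
```

Because the accumulator resets the event name after each blank line, a `ping` with or without a data body is tolerated, and the parser no longer breaks when a keepalive arrives mid-conversation.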