
The streaming responses of the chat assistant's underlying orchestration have become blocked after the version upgrade.

Open starweavingdream opened this issue 9 months ago • 2 comments

Self Checks

  • [x] This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.3

Cloud or Self Hosted

Self Hosted (Docker), Cloud

Steps to reproduce

Hello, author. After upgrading from version 0.13.2 to 1.1.3, my previously orchestrated Chatflows no longer support streaming responses properly. Specifically, a response now arrives as a single tag instead of the usual multiple tags. I tried debugging with breakpoints but couldn't pinpoint the cause; it feels like a bug. I have also reproduced the problem on the official cloud.dify.ai platform. My validation video and DSL are below.

I previously reported this issue as #13923, but user @laipz8200 couldn't reproduce it. This might have been due to an incomplete DSL in my initial report. This time, I will provide the complete DSL file.

If relevant screenshots are needed, please refer to #13923.

In the video, two nodes were tested sequentially: the IF-ELSE node was attempted first, followed by the Question Classification node.

https://github.com/user-attachments/assets/a45622d7-9bd7-44aa-af0d-38448dcaa751

测试流转阻塞模式.txt ("test flow blocking mode")
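One way to confirm the symptom above programmatically is to count the `message` events in the SSE stream returned by Dify's `POST /v1/chat-messages` endpoint when `response_mode` is `streaming`. A minimal sketch follows; the sample SSE lines are illustrative, not captured from this workflow:

```python
import json

def count_message_events(sse_lines):
    """Count `message` events in a Dify-style SSE stream.

    A healthy streaming response emits many `message` events, one per
    text chunk; the reported bug collapses the whole answer into a
    single event.
    """
    count = 0
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        payload = json.loads(line[len("data: "):])
        if payload.get("event") == "message":
            count += 1
    return count

# Illustrative sample of a healthy stream: three text chunks.
healthy = [
    'data: {"event": "message", "answer": "Hel"}',
    'data: {"event": "message", "answer": "lo "}',
    'data: {"event": "message", "answer": "world"}',
    'data: {"event": "message_end"}',
]
# What the reported bug looks like: the full answer in one chunk.
blocked = [
    'data: {"event": "message", "answer": "Hello world"}',
    'data: {"event": "message_end"}',
]
print(count_message_events(healthy))  # 3
print(count_message_events(blocked))  # 1
```

Running this against the real endpoint before and after the upgrade would make the regression measurable rather than visual.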

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

starweavingdream · Mar 26 '25 07:03

I found a similar open issue that might be relevant to your problem. It suggests the problem may be triggered by a Question Classifier node placed before the LLM node when the number of categories exceeds one. Additionally, placing variables in the Answer node after a branch node (such as if-else, question classifier, or failed-branch) can block streaming. It is recommended to review the workflow configuration to ensure compatibility with the new version and avoid configurations that may lead to blocking responses [1].

You might want to check if your workflow configuration aligns with these suggestions.
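For reference, the configuration pattern warned about above looks roughly like the fragment below in a chatflow DSL. This is a hypothetical sketch of Dify's YAML DSL; the node ids, field layout, and variable syntax are illustrative assumptions, not taken from the reporter's attachment:

```yaml
workflow:
  graph:
    nodes:
      - id: if-else-1
        data:
          type: if-else          # branch node ahead of the answer path
      - id: llm-1
        data:
          type: llm
      - id: answer-1
        data:
          type: answer
          # Referencing an upstream variable here, downstream of a
          # branch node, is the pattern reported to block streaming.
          answer: '{{#llm-1.text#}}'
    edges:
      - source: if-else-1
        target: llm-1
      - source: llm-1
        target: answer-1
```

If your DSL contains this shape, moving the Answer node's variable reference out of the branched path may be worth testing.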


dosubot[bot] · Mar 26 '25 07:03

@laipz8200 @crazywoola Hello, could you please confirm whether this issue can be reproduced? Do you have any related troubleshooting insights or solutions?

starweavingdream · Apr 01 '25 08:04

@laipz8200 @crazywoola Hello, I have tested version 1.3.0 and the issue still exists.

starweavingdream · Apr 14 '25 06:04

We haven't released 1.3.0 yet. :)

crazywoola · Apr 14 '25 08:04

@crazywoola Sorry, I made a mistake. It's 1.2.0 😂

starweavingdream · Apr 16 '25 06:04