
The streaming responses of the chat assistant's underlying orchestration have become blocked after the version upgrade.

Open starweavingdream opened this issue 10 months ago • 7 comments

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please submit issues in English, or they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.3

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Hello, I upgraded from version 0.13.2 to 0.15.3, and after the upgrade my previously orchestrated workflow (Chat Assistant - Basic Orchestration) no longer streams responses properly. Specifically, during a response there is only a single `message` event containing all the content, whereas normally there should be many `message` events. I tried debugging with breakpoints but could not pinpoint the cause; it feels like a bug. However, when I create a new workflow from scratch, streaming responses work correctly. Could this be a compatibility issue introduced by the upgrade? What should I do to resolve it?
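A quick way to verify the symptom (a sketch, not part of the original report): in streaming mode, Dify's `/chat-messages` API sends Server-Sent Events as `data:` lines whose JSON payload carries an `event` field, so counting the `"message"` events distinguishes a healthy token-by-token stream from the collapsed single-event response described above. The helper and sample payload below are illustrative assumptions, not captured output from a real run.

```python
# Sketch: count `"message"` events in a raw SSE body from a Dify
# /chat-messages streaming response. The sample payload is fabricated
# for illustration; event names follow the Dify API docs.
import json

def count_message_events(sse_body: str) -> int:
    """Count SSE chunks whose JSON payload has event == "message"."""
    count = 0
    for line in sse_body.splitlines():
        if not line.startswith("data:"):
            continue  # skip blank separator lines between SSE chunks
        payload = json.loads(line[len("data:"):].strip())
        if payload.get("event") == "message":
            count += 1
    return count

# A healthy stream delivers each token as its own "message" event.
healthy = "\n\n".join(
    'data: {"event": "message", "answer": "%s"}' % tok
    for tok in ["Hel", "lo", "!"]
) + '\n\ndata: {"event": "message_end"}'

print(count_message_events(healthy))  # prints 3
```

On an affected workflow, the same check against the real response body would report only one `"message"` event carrying the entire answer.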

As shown in the screenshot below, there is only a single message event.

Image

The screenshot below shows my workflow configuration. Please note the text highlighted in red—those nodes represent the most simplified test setup I used.

Image

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

starweavingdream avatar Feb 18 '25 09:02 starweavingdream

I found a relevant discussion that might be related to your issue. There is an open discussion about the /chat-messages API responding slowly with data in streaming mode [1]. Although it doesn't directly address the blocking issue you're experiencing, it might provide some insights or similar experiences from other users.

Regarding your specific issue, there was a fix implemented in version 0.15.2 for a problem where adding an If-Else node before an LLM node broke streaming output, causing streaming responses to revert to blocking responses [2]. This might be related to the issue you're facing if your workflow includes such nodes.

Since creating a new workflow resolves the issue, it could indeed be a compatibility problem with the upgrade. You might want to compare the configurations and nodes between your old and new workflows to identify any discrepancies. Additionally, ensure that any custom configurations or nodes in your old workflow are compatible with the new version.


dosubot[bot] avatar Feb 18 '25 09:02 dosubot[bot]

@crazywoola Hello, after multiple tests I found that a Question Classifier node placed before the LLM node triggers this issue whenever it has more than one category.

starweavingdream avatar Feb 19 '25 04:02 starweavingdream

Has this problem been solved? I set up a bare workflow with just an LLM's input and output, and it produces only a single message event with an empty answer.

ruanzhijian avatar Feb 25 '25 07:02 ruanzhijian

@ruanzhijian I'm sorry, I haven't managed to solve it either, and I'm also waiting for an official fix, as I'm not a professional Python developer. @laipz8200 Hello, I noticed this issue has been assigned to you. I tried upgrading to version 1.0.0, but the problem still persists. If you've seen this message, could you share your thoughts on how to resolve it? Or have you been able to identify the root cause?

starweavingdream avatar Mar 04 '25 07:03 starweavingdream

Hi @starweavingdream. When the variables in the Answer Node are placed after a branch node, streaming will be blocked. Branch nodes include if-else, classifier, failed-branch, etc. Could you share a minimal Workflow that can reproduce this issue?

laipz8200 avatar Mar 05 '25 07:03 laipz8200

@laipz8200 Hello, here is a minimal workflow example I prepared. Since GitHub does not allow uploading .yml files, I changed the extension from .yml to .txt; please change it back to .yml before importing the DSL. The sample also reproduces the issue on https://cloud.dify.ai, where the stream is terminated and forcibly converted into a blocking response.

AI问答助手.txt

starweavingdream avatar Mar 10 '25 01:03 starweavingdream

@laipz8200 Hello, may I ask if this issue can be fixed based on the files above?

starweavingdream avatar Mar 17 '25 01:03 starweavingdream

@starweavingdream Hi there,

I cannot reproduce your issue. Check the following video:

https://github.com/user-attachments/assets/63acdfdf-e5f0-4298-be5e-5205699c14f4

laipz8200 avatar Mar 17 '25 08:03 laipz8200