
LLM node not streaming

Open · DavideDelbianco opened this issue 9 months ago · 4 comments

Self Checks

  • [x] This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.3

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I've seen the following related issues:

  • #13626, where the bug is supposedly fixed.
  • #15700, where the solution seems to be to remove the IF/ELSE node before the Question Classifier node.
  • #15700, where the bug seems to be related to crossing Question Classifier nodes.
  • #16882, where the solution is to place the streaming LLM + Answer nodes inside a Loop node.

In the video you can see me:

  1. Use the streaming node inside the Loop node (it works)
  2. Use the same flow with the streaming node outside the Loop node (it does not work)
  3. Remove the connection coming from a non-executed branch of the first Question Classifier (it works even if there's an IF/ELSE node before the second Classifier node)
  4. Execute a simple flow that seems to work in a freshly created chatflow
  5. Strip down my flow (upgraded from previous versions) until it exactly matches the working example (it does not work)
  6. Demonstrate that adding an IF/ELSE node before the Classifier nodes that merge into the LLM node breaks the streaming output.

It seems to me that there are multiple issues related to the Question Classifier node + IF/ELSE node + merging branches before an LLM node, but only when one of the branches remains unexecuted, and possibly also something broken in old DSL, given that points 4 and 5 yield different results.
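To make that suspicion concrete, below is a minimal sketch of how a stream-gating check over merged branches could block streaming when the unchosen branch is never marked as resolved. This is hypothetical illustration code, not Dify's actual implementation; every name in it is invented.

```python
# Minimal sketch (not Dify's code) of a stream-gating check over merged
# branches. All names here are hypothetical.

from collections import defaultdict

class Graph:
    def __init__(self, edges):
        # edges: list of (upstream, downstream) node-id pairs
        self.parents = defaultdict(list)
        for up, down in edges:
            self.parents[down].append(up)

def can_stream(graph, node, executed, skipped):
    """Allow streaming only once every upstream branch is resolved.

    A branch is 'resolved' if its node either ran or was explicitly
    marked as skipped (e.g. the classifier chose another branch).
    If an unexecuted branch is merely absent from both sets, this
    check keeps waiting and streaming never starts.
    """
    for parent in graph.parents[node]:
        if parent not in executed and parent not in skipped:
            return False  # unresolved branch blocks streaming
        if not can_stream(graph, parent, executed, skipped):
            return False
    return True

# Two classifier branches merge into one LLM node.
g = Graph([("classifier", "branch_a"), ("classifier", "branch_b"),
           ("branch_a", "llm"), ("branch_b", "llm")])

executed = {"classifier", "branch_a"}
# Bug scenario: branch_b was never run, but nothing marked it skipped.
print(can_stream(g, "llm", executed, skipped=set()))         # False -> no streaming
# Correct scenario: the unchosen branch is recorded as skipped.
print(can_stream(g, "llm", executed, skipped={"branch_b"}))  # True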

https://drive.google.com/file/d/1uwQ142aKNAQKbwCex2dDgFQ8-G_hmNUS/view?usp=drive_link

✔️ Expected Behavior

LLM nodes should stream their output independently of what happened in previous branches with Classifier nodes

❌ Actual Behavior

The LLM node does not stream its output under more than one condition when a Classifier node was used earlier in the flow

DavideDelbianco · Mar 28 '25

There are a few relevant discussions and issues related to the problem you're experiencing with the LLM node not streaming output when used with Classifier and IF/ELSE nodes:

  1. Discussion on Slow Response with Streaming: There is an ongoing discussion about slow response times when using the Dify chatflow's API with streaming enabled [1].

  2. Issue with Streaming Output Failing: A closed issue addressed a problem where streaming output failed when LLM nodes enabled error handling. This was resolved by a pull request that fixed the stream output error [2].

  3. Classifier Node and IF/ELSE Node Issues: There is a known limitation in version v0.15.3 where the parallel chatflow gets stuck at the Question Classifier node when there is a connection between the Variable Assigner and If/Else node. This is expected to be fixed in the upcoming v1.0.0 release [3].

  4. Nested IF/ELSE Workflow Issue: The LLM node not streaming output in certain conditions, especially with nested IF/ELSE nodes, was due to an issue in the _fetch_node_ids_in_reachable_branch method. This has been addressed in a recent update through PR #12105 in Dify version 0.14.2 [4].

It seems like your issue might be related to these known problems, particularly with the handling of branches and the execution order in workflows involving Classifier and IF/ELSE nodes. You might want to check if your setup aligns with the solutions provided in these discussions and issues. If the problem persists, consider whether any recent updates or patches might address your specific scenario.
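As a rough illustration of point 4, a reachability computation in the spirit of the `_fetch_node_ids_in_reachable_branch` method might look like the sketch below. This is a simplified, hypothetical version, not Dify's implementation; node and edge names are invented.

```python
# Hypothetical sketch of a reachable-branch computation (not Dify's code).

from collections import deque

def reachable_node_ids(edges, start):
    """BFS over (source, target) edges from the branch that was taken."""
    children = {}
    for src, dst in edges:
        children.setdefault(src, []).append(dst)
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in children.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

edges = [("if_else", "then_branch"), ("if_else", "else_branch"),
         ("then_branch", "llm"), ("else_branch", "llm"), ("llm", "answer")]

# The IF/ELSE took the "then" edge; everything reachable from it may run.
reachable = reachable_node_ids(edges, "then_branch")
all_nodes = {n for e in edges for n in e}
# Nodes outside the reachable set should be marked skipped so the merge
# point (the LLM node) does not wait on them before streaming.
print(sorted(all_nodes - reachable - {"if_else"}))  # ['else_branch']
```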


dosubot[bot] · Mar 28 '25

I found a workaround: wrap the node whose streaming output is broken inside a Loop component set to run only once, and put the LLM and Answer components inside it. This restores normal streaming output.

[Image: screenshot of the Loop-node workaround]

ft4710403 · Mar 29 '25

@ft4710403 it's among the tests that I've done, but the issue needs to be fixed 😊

DavideDelbianco · Mar 29 '25

@DavideDelbianco I’ve tested using the streaming node inside a Loop node — it works only if there are no failed nodes attached to the streaming node. If there’s a failed node connected to the streaming node, streaming does not work as expected. Removing the failed node restores normal streaming behavior.
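A quick way to observe this from the client side is to call the chatflow's API in streaming mode and watch how the SSE events arrive. The sketch below assumes Dify's documented chat-messages endpoint; the URL and API key are placeholders for your deployment.

```python
# Client-side check for streaming via Dify's chat-messages API.
# URL and key are placeholders; adjust for your self-hosted instance.

import json
import requests

API_URL = "http://localhost/v1/chat-messages"   # your Dify deployment
API_KEY = "app-xxxxxxxx"                        # the chatflow's API key

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "inputs": {},
        "query": "hello",
        "response_mode": "streaming",  # SSE; use "blocking" to compare
        "user": "stream-test",
    },
    stream=True,
)

# A healthy stream yields many small `data:` events as tokens arrive.
# The bug shows up as one large event delivered only after the LLM
# node has fully finished.
for line in resp.iter_lines():
    if line.startswith(b"data: "):
        event = json.loads(line[len(b"data: "):])
        print(event.get("event"), len(event.get("answer", "")))
```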

saurabhy27 · Apr 24 '25

I want to verify whether the mentioned PR has fixed this issue, but unfortunately I cannot play the video. Could you provide something I can use to verify whether the PR has fixed the issue?

Nov1c444 · May 13 '25

I have restored the video; let me know if the link does not work.

DavideDelbianco · May 13 '25

It works, but the issue of streaming output when one node is connected to multiple condition nodes, which I mentioned above, has still not been resolved by the PR. I will do my best to solve this problem soon.

Nov1c444 · May 14 '25

Is the PR a valid one? Does this issue still exist?

DavideDelbianco · Jun 26 '25

> Is the PR a valid one? Does this issue still exist?

Fixed by https://github.com/langgenius/dify/pull/20880

Nov1c444 · Jun 27 '25

> I found a workaround: wrap the node whose streaming output is broken inside a Loop component set to run only once, and put the LLM and Answer components inside it. This restores normal streaming output.

Good!!!

LXXiaogege · Dec 12 '25