
When using a tool to return image data, it is extremely slow

Open · QunBB opened this issue 9 months ago · 1 comment

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.3

Cloud or Self Hosted

Self Hosted (Source)

Steps to reproduce

Create a chatflow app, add the ComfyUI tool node, and finally return the image.

Some analysis from my testing

The problem lies in the _stream_request function in api/core/plugin/manager/base.py, and something seems wrong in the plugin_daemon SSE handling. In stream mode, _stream_request receives the first response very slowly; when I changed it to non-stream mode, it behaved normally.
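
For reference, the timing difference can be reproduced outside Dify by comparing time-to-first-byte with and without streaming against the plugin daemon's dispatch endpoint. This is a minimal sketch, not Dify's actual client code; the URL and payload below are placeholders.

```python
# Minimal sketch: measure time until the first response chunk arrives,
# with and without HTTP streaming. URL and payload are placeholders.
import time
import requests

URL = "http://localhost:5002/plugin/<tenant-id>/dispatch/tool/invoke"  # placeholder
PAYLOAD = {"tool": "comfyui", "parameters": {}}                        # placeholder


def time_to_first_chunk(stream: bool) -> float:
    """Return seconds from sending the request until the first chunk is read."""
    start = time.monotonic()
    resp = requests.post(URL, json=PAYLOAD, stream=stream, timeout=600)
    # With stream=True, iter_content yields data as it arrives over the wire;
    # with stream=False, the whole body is already downloaded at this point.
    next(resp.iter_content(chunk_size=1024), b"")
    return time.monotonic() - start


if __name__ == "__main__":
    print(f"streaming:     {time_to_first_chunk(True):.2f}s to first chunk")
    print(f"non-streaming: {time_to_first_chunk(False):.2f}s to first chunk")
```

If the streaming run takes far longer to produce its first chunk than the non-streaming run takes to complete, that would point at the plugin_daemon/SSE path rather than at ComfyUI or the network.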

✔️ Expected Behavior

The plugin_daemon log: [GIN] 2025/03/27 - 09:07:01 | 200 | 4.998979025s | 192.168.xxx.xxx | POST "/plugin/fe02ab38-51c9-49e1-8e0f-717efab59378/dispatch/tool/invoke"

The app should take about 5-6s to complete.

❌ Actual Behavior

(Screenshot attached.)

QunBB · Mar 27 '25 09:03

There are a few similar discussions regarding slow response times when using streaming mode in Dify:

  1. Using post to call dify chatflow's API (enable streaming), the response is very slow [1].
  2. The API service that uses DIFY uses the streaming stream, but the result is that it waits for a long time to return at the same time, what is the reason for this? [2].
  3. When publishing an API, the '/chat-messages' API responds slowly with data in streaming mode [3].

These discussions remain unresolved, but they might provide some insights or potential workarounds.

Regarding your issue, the _stream_request function in api/core/plugin/manager/base.py is designed to handle streaming requests by processing the response incrementally. This can sometimes lead to slower initial response times compared to non-streaming mode, where the entire response is processed at once [4].

To potentially improve performance, you might consider adjusting configuration settings related to timeouts in your .env file, such as HTTP_REQUEST_MAX_READ_TIMEOUT and HTTP_REQUEST_MAX_WRITE_TIMEOUT, to better handle streaming requests [5].
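
If you try that, it is only a couple of lines in the API service's .env file; the values below are illustrative only, and the services typically need a restart to pick up the change.

```
# Illustrative values only; tune to your deployment and workload.
HTTP_REQUEST_MAX_READ_TIMEOUT=600
HTTP_REQUEST_MAX_WRITE_TIMEOUT=600
```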

To continue talking to Dosu, mention @dosu.



dosubot[bot] · Mar 27 '25 09:03