Good Wood
> I found similar closed issues that might help you address the problem of truncated LLM response text in Dify:
>
> 1. One issue suggested checking if the maximum...
It can only stay open for 240s. I have tried many times! @crazywoola
The root cause of the issue wasn't a problem with the socket itself, but rather that the upstream plugin imposed a 240-second timeout on the response stream.
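To make that failure mode concrete, here is a minimal, hypothetical sketch (not the plugin's actual code) of a reader that enforces a hard 240-second wall-clock cap on a streaming response: anything the model emits after the cap never reaches the caller, which shows up as truncated text. The URL, timeout values, and function name are illustrative assumptions.

```python
import time

import requests


def read_stream_with_deadline(url: str, deadline_s: float = 240.0) -> str:
    """Read a streaming response, but stop once a wall-clock deadline passes.

    This mimics an upstream component that caps the whole stream at 240s:
    whatever is generated after the cap is silently dropped, so the final
    answer looks truncated.
    """
    start = time.monotonic()
    chunks = []
    # stream=True keeps the connection open and yields the body incrementally.
    with requests.get(url, stream=True, timeout=(5, 30)) as resp:
        for line in resp.iter_lines(decode_unicode=True):
            if time.monotonic() - start > deadline_s:
                break  # stream cut off here; the tail of the answer is lost
            if line:
                chunks.append(line)
    return "\n".join(chunks)
```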
@liujuping @1ncounter
Is this no longer maintained?
Your model should support one of these features: 
> I have already done this configuration in the model interface.

@doit-5618 check the API: /console/api/workspaces/current/models/model-types/llm
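For anyone checking that endpoint, a small sketch of how to query it and inspect the result; the console base URL and the bearer-token Authorization header are assumptions about a self-hosted setup, so adjust them to your deployment.

```python
import json

import requests

# Assumptions for a self-hosted deployment: base URL and auth scheme may differ.
CONSOLE_BASE = "http://localhost/console/api"
TOKEN = "<console access token>"

resp = requests.get(
    f"{CONSOLE_BASE}/workspaces/current/models/model-types/llm",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Dump the raw payload to verify which LLM providers/models the workspace
# actually exposes and which features they report.
print(json.dumps(resp.json(), indent=2, ensure_ascii=False))
```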
Managed with lerna, then everything is built together in one unified build.