Flowise
[FEATURE] - Support streaming response over Sequential Agents
Describe the feature you'd like Sequential Agents should support streaming responses from the LLM (mainly OpenAI).
Additional context As of now, Sequential Agents do not support streaming responses from the LLM. This reduces the usability of these agents; please fix it.
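To illustrate the requested behavior, here is a minimal sketch of streaming versus blocking output. The generator below only simulates an LLM emitting tokens (it is not Flowise or OpenAI code); the point is that each chunk can be forwarded to the client as it arrives instead of waiting for the full response:

```python
from typing import Iterator

def fake_llm_stream(prompt: str) -> Iterator[str]:
    """Stand-in for a streaming LLM API: yields tokens one at a time."""
    for token in ["Sequential ", "Agents ", "can ", "stream ", "too."]:
        yield token

def run_streaming(prompt: str) -> str:
    """Consume the stream incrementally, collecting chunks as they arrive."""
    chunks = []
    for chunk in fake_llm_stream(prompt):
        chunks.append(chunk)  # in a real UI, each chunk would be pushed to the client here
    return "".join(chunks)
```

With streaming, the user starts seeing tokens immediately; without it, nothing renders until the whole response is complete.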
@HenryHengZJ 🙏
Thanks