Redis Chat Memory - Message Store | Stream does not save data
Bug Description
As seen in the attached image, the components in the flow are connected as shown. When I chat via the Playground with this connection, both the user's question and the AI's answer are recorded on the Redis server; I can verify this with RedisInsight. However, when I talk to my backend application (using the flow ID) through the Streamlit front end (screenshots attached), only the user's question reaches the Redis server. The AI's answer is never stored there.
I think the problem is caused by the "stream" feature being turned on. When I chat without streaming, the AI's answer returned to my backend project is also recorded on the Redis server.
Thank you.
Reproduction
```python
from typing import Iterator, Optional

def run_flow(
    message: str,
    endpoint: str,
    output_type: str = "chat",
    input_type: str = "chat",
    tweaks: Optional[dict] = None,
    api_key: Optional[str] = None,
    session_id: Optional[str] = None,
    stream: bool = False,
) -> Iterator[str] | dict:
    api_url = f"{BASE_API_URL}/api/v1/run/{endpoint}"
```
Expected behavior
In my backend Python/Streamlit project, when I use the stream feature, the expected behavior is that after the user's question and the streamed answer are complete, the full answer is written to the Redis server with the ("ai") tag.
This works correctly in the Playground; it currently does not work remotely with the flow ID.
Who can help?
No response
Operating System
mac m2
Langflow Version
1.4.0 latest
Python Version
3.11
Screenshot
Flow File
No response
The issue occurs because the /api/v1/run/<flow_id>?stream=true endpoint is deprecated and does not support saving streamed AI responses to Redis. This is why only the user's question is saved when you use the stream feature from your backend, while both the question and the answer are saved in the Playground, which uses a different endpoint for streaming responses.
To resolve this, switch to using the /build endpoint for streaming, which is the recommended approach and is used by the Playground. Here’s how you can implement this:
1. Send a POST request to `/api/v1/build/{flow_id}/flow?stream=true` with your input and session information, e.g. `{ "inputs": { "input_value": "Your message", "session": "your-session-id" } }`. This will return a `job_id`.
2. Use the returned `job_id` to stream results by making a GET request to `/api/v1/build/{job_id}/events`. This provides real-time streaming events, including the AI's answer.
3. Make sure your configuration sets `event_delivery` to `streaming` (in your `base.py` file) for true streaming responses.
4. When parsing the streamed events, look for `end_vertex` events and extract the AI's answer from `event_data.data.build_data.data.results.message.text` to reconstruct the full response.
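Put together, the steps above can be sketched as a small client. The base URL and the newline-delimited-JSON event framing are assumptions; the event path for the AI text follows the description above:

```python
import json
from typing import Iterator, Optional

import requests  # assumption: available in the backend environment

BASE_API_URL = "http://127.0.0.1:7860"  # assumption: local Langflow instance


def extract_ai_text(event: dict) -> Optional[str]:
    """Pull the AI answer out of an end_vertex event, if present.

    Follows the path described above:
    data.build_data.data.results.message.text
    """
    if event.get("event") != "end_vertex":
        return None
    try:
        return event["data"]["build_data"]["data"]["results"]["message"]["text"]
    except (KeyError, TypeError):
        return None


def stream_flow(flow_id: str, message: str, session_id: str) -> Iterator[str]:
    # Step 1: start the build job; the response carries a job_id.
    start = requests.post(
        f"{BASE_API_URL}/api/v1/build/{flow_id}/flow",
        params={"stream": "true"},
        json={"inputs": {"input_value": message, "session": session_id}},
    )
    start.raise_for_status()
    job_id = start.json()["job_id"]

    # Step 2: stream events for the job and yield AI text as it arrives.
    with requests.get(
        f"{BASE_API_URL}/api/v1/build/{job_id}/events", stream=True
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            text = extract_ai_text(json.loads(line))
            if text is not None:
                yield text
```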
This approach should ensure that both the user's question and the AI's answer are saved to Redis when using the stream feature, matching the behavior you see in the Playground. If you continue to experience issues, check your backend's handling of the streamed events and verify that session information is correctly passed in your requests.
Hi, I already mentioned that my issue is not related to the REDIS_URL connection, since the user messages are successfully written to Redis. The problem is this: when I run the same flow using the Langflow UI (Playground), both the user message and the agent's response are saved to Redis. However, when I run the same flow from the backend (e.g., via Streamlit), only the user message is stored — the agent response is not saved.
This suggests that the RedisChatMessageHistory object is properly initialized, but the agent's response might not be passed into memory.save_context() during execution.
So the issue is not a connection misconfiguration, but rather that the assistant's response is not being stored in memory. Could you help with this?
Your analysis is correct: the issue is not with your Redis connection, but with how agent responses are stored when using streaming from the backend.
When you use the Langflow Playground, both user and agent messages are saved to Redis because the Playground streams through the /build endpoint, which is designed to handle and persist streamed responses. In contrast, if your backend uses the deprecated /api/v1/run/<flow_id>?stream=true endpoint, only the user's message is saved; agent responses are not persisted in Redis while streaming through this endpoint. This is a known limitation and matches your observation: turning off streaming results in both messages being saved, while enabling it leaves the agent's response out of Redis.
To resolve this, update your backend to use the /api/v1/build/{flow_id}/flow?stream=true endpoint for initiating the flow. After you receive a job_id, stream the results from /api/v1/build/{job_id}/events. This approach matches the Playground's behavior and ensures both user and agent messages are saved to Redis during streaming.
Also, make sure your backend configuration sets event_delivery to streaming (either in your environment or base.py), and always include the correct session_id in your requests to maintain session context.
If you already use the /build endpoint and still see this issue, double-check that your backend code is correctly parsing the streamed events and that the agent's response is being passed to the memory storage logic after the stream completes. In Langflow, the agent's response is typically saved when the full response is available at the end of the stream, so ensure your backend triggers the save at that point.
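One way to guarantee the save step runs only once the full answer exists is to exhaust the stream first and persist afterwards. A minimal, library-agnostic sketch (the `save_to_memory` callback is hypothetical, standing in for whatever persistence call your backend uses, such as writing the ("ai") message to Redis):

```python
from typing import Callable, Iterable


def finish_stream_then_save(
    chunks: Iterable[str],
    save_to_memory: Callable[[str], None],
) -> str:
    """Accumulate streamed text chunks; only after the stream is
    exhausted is the complete answer passed to the save callback."""
    parts = []
    for chunk in chunks:
        parts.append(chunk)  # forward each chunk to the UI here if desired
    full_answer = "".join(parts)
    save_to_memory(full_answer)  # runs exactly once, after the stream ends
    return full_answer
```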
If you need to persist tool responses or other custom data, you may need to explicitly call the message storage logic (such as `astore_message`) after the agent's response is fully received.
In summary: switch to the /build streaming API, ensure your backend parses the streamed agent response, and confirm that the save-to-memory logic is triggered after the stream completes. This will align your backend's behavior with the Playground and resolve the missing agent responses in Redis.