[FEATURE] Streaming response
Feature Area
Core functionality
Is your feature request related to an existing bug? Please link it here.
No
Describe the solution you'd like
For production scenarios we want to be able to display the agents' streamed output to the customer, similar to what LangGraph does, where you can iterate over the stream in a for loop and handle each chunk as it arrives.
Is there any way we can get something similar going? I noticed the LLM has a `streaming=True` parameter, but I'm not sure how to retrieve that stream and send it to the client via FastAPI.
Describe alternatives you've considered
A custom callback, though I'm not sure whether this would work.
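A callback could work if it is bridged to an iterator, since FastAPI's `StreamingResponse` wants something it can loop over. A minimal sketch of that bridge, assuming a callback-style entry point; `run_agent` and `fake_agent` are hypothetical names for illustration:

```python
import queue
import threading

def callback_stream(run_agent):
    # Bridge a callback-style API to an iterator: the (hypothetical)
    # run_agent(on_token) invokes on_token(text) for each new chunk.
    q: "queue.Queue" = queue.Queue()

    def worker():
        run_agent(q.put)   # every token the agent emits lands in the queue
        q.put(None)        # sentinel: signals the end of the stream

    t = threading.Thread(target=worker)
    t.start()
    while (chunk := q.get()) is not None:
        yield chunk
    t.join()

# Example: a fake agent that emits three tokens via its callback.
def fake_agent(on_token):
    for tok in ["step ", "by ", "step"]:
        on_token(tok)

print("".join(callback_stream(fake_agent)))  # -> "step by step"
```

The generator returned by `callback_stream` can be handed straight to `StreamingResponse`, so the callback alternative and the streaming endpoint compose.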
Additional context
No response
Willingness to Contribute
Yes, I'd be happy to submit a pull request
I second this. Streaming can be used to analyze chunks before the task is complete and cancel the run if it has obviously failed to follow instructions, which could also help reduce costs. A custom callback with streaming support would make my clients very happy.
Agreed. This is important
Agree too.
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
Agree!!!
Agree!