Enh: Add support for Event Emitters
When using pipelines to create agentic tasks, it would be great to update the user's UI with "status" messages like "executing function", "creating images", "buying tickets"... This can be done in open-webui function calling using "Event Emitters".
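For context, this is roughly how such a status update is emitted from an Open WebUI tool/function today (a sketch only; the tool name and arguments are made up, but the `__event_emitter__` callable is the one Open WebUI injects into tools):

```python
from typing import Awaitable, Callable, Optional


class Tools:
    async def buy_tickets(
        self,
        event_name: str,
        __event_emitter__: Optional[Callable[[dict], Awaitable[None]]] = None,
    ) -> str:
        if __event_emitter__:
            # Shows an in-progress "status" message in the chat UI.
            await __event_emitter__(
                {"type": "status", "data": {"description": "Buying tickets...", "done": False}}
            )

        result = f"Tickets purchased for {event_name}"

        if __event_emitter__:
            # Marks the status as finished.
            await __event_emitter__(
                {"type": "status", "data": {"description": "Done", "done": True}}
            )
        return result
```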
Is there any way to support this in Pipelines?
Any guidance on how these emitters could be rigged up in Pipelines to begin with? It would appear we need to acquire an instance of the current event emitter on each invocation of pipe()...
What metadata would need to be assembled, and where, assuming we can get an emitter instance from the backend?
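Something like the hypothetical signature below is what I have in mind. To be clear, `__event_emitter__` is not part of the current Pipelines pipe() interface; this is only a sketch of how a per-invocation emitter could be injected:

```python
from typing import Awaitable, Callable, Generator, List, Optional, Union


class Pipeline:
    def __init__(self):
        self.name = "Agentic Task Pipeline"

    def pipe(
        self,
        user_message: str,
        model_id: str,
        messages: List[dict],
        body: dict,
        __event_emitter__: Optional[Callable[[dict], Awaitable[None]]] = None,
    ) -> Union[str, Generator]:
        # The metadata needed to route the event (chat_id, message_id,
        # session_id) would either be baked into the emitter's closure by the
        # server, or have to be read from `body` and attached to each event.
        if __event_emitter__ is not None:
            pass  # e.g. schedule __event_emitter__({"type": "status", ...})
        return f"Processed: {user_message}"
```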
This would be a wonderful feature. I'd love to see LangGraph agents use event emitters. I just wonder how the chat history (namely, the "thought process") will be saved. It would be nice to pull up an old chat and see the thoughts along with the messages.
Threading to #229, #373, and examples from Open WebUI.
It appears that you could get there via this streaming example with a little bit of work.
It seems that, at the moment, chat events like replace, status, and message are handled over a WebSocket, which can be driven by e.g. the event emitter in Pipes, but unfortunately not in Pipelines, which instead use SSE and support only the OpenAI-compatible message format.
It would be great if chat events could be streamed not only via the internal WebSocket but via SSE as well, allowing any custom backend (and also Pipelines) to create a nice custom UX while building agents.
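As a rough illustration of the proposal, the SSE stream could interleave event payloads with the usual OpenAI-compatible chunks. The "chat.event" object shape below is invented for illustration; it is not part of any existing spec:

```python
import json


def sse_stream(text_chunks):
    # A custom status event, sent before (or between) the completion chunks.
    yield "data: " + json.dumps({
        "object": "chat.event",
        "event": {"type": "status", "data": {"description": "Running agent...", "done": False}},
    }) + "\n\n"

    # Standard OpenAI-compatible streaming chunks follow unchanged.
    for text in text_chunks:
        yield "data: " + json.dumps({
            "object": "chat.completion.chunk",
            "choices": [{"index": 0, "delta": {"content": text}, "finish_reason": None}],
        }) + "\n\n"

    yield "data: [DONE]\n\n"
```

The frontend would then receive these events through the same handlers that the WebSocket emitter drives today.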
This issue has been open for 6 months with no update. Sad, because this would really add to the user experience :(
Related: https://github.com/open-webui/open-webui/discussions/8461#discussioncomment-12555203
I implemented a solution to stream events inside the pipeline chat completion stream.
I simply added a new "event: {event_message}" line to the streaming protocol and simulated an event_emitter function in the pipeline.
Then, on the Open WebUI server, I parse the stream and use the standard WebSocket event_emitter to forward the event to the frontend.
Hope this can be helpful to somebody else.
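In rough form, the pipeline side looks something like this (the exact wire format lives in the linked changes below; the "event: {...}" line here is only an approximation, and the server is expected to strip it from the visible output):

```python
import json
from typing import Generator, List


class Pipeline:
    def __init__(self):
        self.name = "Pipeline With Status Event"

    def event_emitter(self, event: dict) -> str:
        # Serialize the event as an extra line in the completion stream so the
        # Open WebUI server can detect it, remove it from the visible output,
        # and forward it to the frontend over the regular WebSocket emitter.
        return f"event: {json.dumps(event)}\n"

    def pipe(self, user_message: str, model_id: str, messages: List[dict], body: dict) -> Generator:
        yield self.event_emitter(
            {"type": "status", "data": {"description": "Executing function...", "done": False}}
        )
        yield "Here is the final answer."
        yield self.event_emitter(
            {"type": "status", "data": {"description": "Done", "done": True}}
        )
```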
You can find my modifications in:
Add parsing of event from pipeline stream
and
Adding status event in pipeline
and there is a pipeline file example called examples/pipelines/pipeline_with_status_event.py
Hey @AnthonyDurussel, first of all, thanks for your work on Event Emitters for pipelines! I'm trying to use it in a Text to SQL Trino pipeline, but if I add the yield piece of code following your example, the chat doesn't show the response even with HTTP 200 from the pipeline, something similar to https://github.com/open-webui/pipelines/pull/491#issuecomment-280185749,5 , which I'm not sure has been solved already. Any thoughts?
We have a use case that would benefit from Event Emitters working in our pipelines. Any new information on this, or on whether it is on the roadmap?
I am so excited to see this happen! I tried it using the example and it works great!!
This issue is settled, then, right?
Not settled, I believe, because we still need to override the openai.py router? We're using 0.6.5. Or has this been supported since 0.6.7?
I'm on 0.6.9 - it works for me.
Do they work in 0.6.13 without overriding openai.py?