
Enh: Add support for Event Emitters

Open juancarlosm opened this issue 1 year ago • 12 comments

When using pipelines to create agentic tasks, it would be great to update the user's UI with "status" messages, like "executing function", "creating images", "buying tickets"... This can already be done in open-webui function calling using "Event Emitters".

Is there any way to support this in Pipelines?
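
For context, a status update from an open-webui Function looks roughly like this; the `__event_emitter__` callable is injected by open-webui, and the description strings are only placeholders:

```python
# Rough sketch of emitting status events from an open-webui Function (Pipe),
# where open-webui injects the __event_emitter__ callable. Strings are placeholders.
from typing import Awaitable, Callable, Optional


class Pipe:
    async def pipe(
        self,
        body: dict,
        __event_emitter__: Optional[Callable[[dict], Awaitable[None]]] = None,
    ):
        if __event_emitter__:
            await __event_emitter__(
                {
                    "type": "status",
                    "data": {"description": "Executing function...", "done": False},
                }
            )

        result = "final answer text"  # the actual agentic work would happen here

        if __event_emitter__:
            await __event_emitter__(
                {
                    "type": "status",
                    "data": {"description": "Done", "done": True},
                }
            )
        return result
```

The question is how to get an equivalent hook inside a Pipelines `pipe()`, which runs in a separate server and only talks to open-webui over an OpenAI-compatible API.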

juancarlosm avatar Aug 15 '24 06:08 juancarlosm

Any guidance on how these emitters could be wired into pipelines to begin with? It would appear we need to acquire an instance of the current event emitter on each invocation of pipe()...

And where would the metadata be assembled, assuming we can get an emitter instance from the backend?
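
For reference, the Pipelines `pipe()` signature used throughout the examples in this repo looks roughly like this; there is no emitter argument to acquire, which is exactly the gap being asked about:

```python
# Rough sketch of the current Pipelines interface: pipe() receives only the
# OpenAI-compatible request data, with no injected event emitter to call.
from typing import Generator, Iterator, List, Union


class Pipeline:
    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # Any per-request metadata would have to travel inside `body` (or inside
        # the response stream itself), since nothing like __event_emitter__ is passed in.
        return f"received: {user_message}"
```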

jeremykei avatar Oct 21 '24 16:10 jeremykei

This would be a wonderful feature. I'd love to see LangGraph agents use event emitters. I just wonder how the chat history (namely, the "thought process") will be saved. It would be nice to pull up an old chat and see the thoughts along with the messages.

pfr-mf6 avatar Nov 19 '24 18:11 pfr-mf6

Linking to #229, #373, and some examples from open-webui.

It appears that you could get there via this streaming example with a little bit of work.

ezavesky avatar Dec 23 '24 15:12 ezavesky

It seems that at the moment chat events like replace, status, and message are handled over a WebSocket, which can be driven by e.g. the event emitter in Pipe(s), but unfortunately not in Pipelines, which instead use SSE and support only the OpenAI-compatible message format.

It would be great if chat events could be streamed not only via the internal WebSocket but via SSE as well, allowing any custom backend (including Pipelines) to build a nice custom UX for agents.
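
To make the difference concrete, here is a small illustration of what a Pipeline can put on the SSE stream today versus the kind of event payload that currently only travels over the internal WebSocket; the enveloped event chunk at the end is hypothetical, not an agreed wire format:

```python
# Illustration only: today's OpenAI-compatible SSE chunk vs. a hypothetical
# event-carrying chunk that a Pipeline could stream if SSE events were supported.
import json

# What Pipelines can stream today: chat.completion.chunk deltas over SSE.
openai_chunk = {
    "object": "chat.completion.chunk",
    "choices": [{"index": 0, "delta": {"content": "partial text"}, "finish_reason": None}],
}
print(f"data: {json.dumps(openai_chunk)}\n")

# What only the WebSocket path carries today: UI events such as "status".
# Streaming something like this over SSE as well is what is being proposed here
# (the envelope below is made up for illustration).
ui_event = {"type": "status", "data": {"description": "buying tickets", "done": False}}
print(f"data: {json.dumps({'event': ui_event})}\n")
```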

mpangrazzi avatar Jan 09 '25 14:01 mpangrazzi

This issue has been open for 6 months with no update. Sad, because this would really add to the user experience :(

savvaki avatar Mar 19 '25 17:03 savvaki

Related: https://github.com/open-webui/open-webui/discussions/8461#discussioncomment-12555203

tjbck avatar Mar 19 '25 18:03 tjbck

I implemented a solution to stream events inside the pipeline chat completion stream.

I simply added a new "event: {event_message}" message inside the streaming protocol and simulated an event_emitter function in the pipeline.

Then, on the open-webui server, I parse the stream and use the standard WebSocket event_emitter to forward the event to the frontend.

Hope this can be helpful to somebody else.

You can find my modifications in:

Add parsing of event from pipeline stream

and

Adding status event in pipeline

and there is an example pipeline file called examples/pipelines/pipeline_with_status_event.py
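
For anyone skimming, the pipeline side of this approach boils down to yielding event markers into the completion stream, which the open-webui server then parses and forwards through the normal WebSocket event emitter. A minimal sketch, with an invented marker format; see the linked commits and examples/pipelines/pipeline_with_status_event.py for the actual implementation:

```python
# Minimal sketch of the pipeline side: interleave "event: {...}" markers with
# normal content chunks in the stream. The marker format here is illustrative;
# the linked commits define the real one, and the open-webui side must parse it
# and forward each event over the WebSocket event emitter.
import json
from typing import Generator, List


class Pipeline:
    def _status_event(self, description: str, done: bool = False) -> str:
        # Simulated event_emitter: serialize a status event into the stream.
        event = {"type": "status", "data": {"description": description, "done": done}}
        return f"event: {json.dumps(event)}\n"

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Generator[str, None, None]:
        yield self._status_event("executing function", done=False)
        yield "Here is the actual answer text..."  # normal content chunk
        yield self._status_event("done", done=True)
```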

AnthonyDurussel avatar Apr 03 '25 19:04 AnthonyDurussel

Hey @AnthonyDurussel, first of all, thanks for your work on Event Emitters for pipelines! I'm trying to use it in the Text to SQL Trino Pipeline, but if I add the yield piece of code following your example, the chat doesn't show the response even with HTTP 200 on the pipeline, which seems similar to https://github.com/open-webui/pipelines/pull/491#issuecomment-2801857495, and I'm not sure that has been solved already. Any thoughts?

[Image attached]

italux avatar Apr 22 '25 12:04 italux

We have a use case that would benefit from Event Emitters working in our pipelines. Is there any new information on this, or is it on the roadmap?

evolutioned avatar Apr 26 '25 01:04 evolutioned

I am so excited to see this happen! I tried it using the example and it works great!!

This issue is settled, then, right?

PlebeiusGaragicus avatar May 09 '25 23:05 PlebeiusGaragicus

Not settled, I believe, because we still need to override the openai.py router? We're using 0.6.5. Or has this been supported in 0.6.7?

evolutioned avatar May 12 '25 03:05 evolutioned

We're using 0.6.5. Or has this been supported in 0.6.7?

I'm on 0.6.9 - it works for me.

pfr-mf6 avatar May 13 '25 21:05 pfr-mf6

Do they work in 0.6.13 without overriding openai.py?

salehardec avatar Jun 04 '25 10:06 salehardec