
[Bug]: WebSocket issues with OpenAI Realtime API in the browser

Open mirodrr2 opened this issue 1 year ago • 3 comments

What happened?

Has anyone managed to get this project working with LiteLLM? https://github.com/openai/openai-realtime-console

It's a React app that calls the OpenAI Realtime voice API directly from the browser via a WebSocket. Under the hood, it opens the WebSocket like this:

const WebSocket = globalThis.WebSocket;
const ws = new WebSocket(wsUrl, [
  'realtime',
  `openai-insecure-api-key.${this.apiKey}`,
  'openai-beta.realtime-v1',
]);

The behavior is that the client app is able to reach the /v1/realtime endpoint, and the server is able to run await websocket.accept(), but the connection is instantly closed (seemingly from the browser's end) with a generic 1006 error. I've added a ton of logs on both the client and server, but haven't found anything that gets me closer to a solution.
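For reference, the client-side handlers I used to capture this are roughly the following sketch (same wsUrl and subprotocols as above):

ws.addEventListener('close', (event) => {
  // 1006 = abnormal closure: the connection dropped without a close
  // frame, so event.reason is typically empty and wasClean is false.
  console.log('close code:', event.code, 'reason:', event.reason, 'wasClean:', event.wasClean);
});
ws.addEventListener('error', (event) => console.error('socket error:', event));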

I've gotten this exact code to work against LiteLLM in a JavaScript app outside the browser, but no matter what I do, it does not work in the browser. I am running LiteLLM on AWS on ECS behind an Application Load Balancer.
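A rough sketch of the non-browser test that works, using the Node ws package (wsUrl and apiKey here stand in for my LiteLLM /v1/realtime URL and key):

import WebSocket from 'ws';

// Same subprotocol list as the browser client; ws sends these in the
// Sec-WebSocket-Protocol header during the upgrade handshake.
const ws = new WebSocket(wsUrl, [
  'realtime',
  `openai-insecure-api-key.${apiKey}`,
  'openai-beta.realtime-v1',
]);

ws.on('open', () => console.log('connected, negotiated subprotocol:', ws.protocol));
ws.on('close', (code, reason) => console.log('closed:', code, reason.toString()));
ws.on('error', (err) => console.error('socket error:', err));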

The frontend code works fine if you call the OpenAI API directly.

It's unclear to me whether this is a LiteLLM issue, an Application Load Balancer issue, or an ECS issue. I've only ever hosted LiteLLM this way, and this is the first problem I've hit after using the rest of the LiteLLM APIs through this setup for a while.

I am not using the relay server option provided by the repo, as I want to avoid having two proxies.

Relevant log output

No response

Twitter / LinkedIn details

No response

mirodrr2 avatar Nov 20 '24 03:11 mirodrr2

Is this issue still occurring, @mirodrr2?

ishaan-jaff avatar Feb 07 '25 23:02 ishaan-jaff

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] avatar May 09 '25 00:05 github-actions[bot]

@mirodrr2 did you have to do anything different to have /realtime exposed? I don't see the endpoint here either - https://litellm-api.up.railway.app/

haran090 avatar May 13 '25 06:05 haran090

I also cannot get the WebSocket passthrough to work on any OpenAI, Azure, or Gemini endpoint.

Usually it just doesn't respond, as described above, but if I send some messages I get back: LiteLLM:ERROR: realtime_streaming.py:176 - Connection closed in backend to client send messages - received 4000 (private use) invalid_request_error.invalid_intent; then sent 4000 (private use) invalid_request_error.invalid_intent
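For context, the messages I send are standard Realtime client events. A minimal sketch of the kind of message that seems to trigger the close (assuming the stock session.update event from the OpenAI Realtime docs):

// Standard OpenAI Realtime client event; even a basic session.update
// seems to surface the invalid_intent 4000 close from the backend.
ws.send(JSON.stringify({
  type: 'session.update',
  session: { modalities: ['text', 'audio'] },
}));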

ClancyDennis avatar Aug 09 '25 21:08 ClancyDennis

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] avatar Nov 08 '25 00:11 github-actions[bot]