WebSocket messages not received when transmitting quickly
Hello,
When sending consecutive WebSocket messages from the server in quick succession, I have noticed that the client's WebSocket callbacks do not fire for every message. How do I ensure no messages sent from the server are lost? I have tested this with a plain JavaScript WebSocket client and don't see the issue there. Any thoughts?
Thanks!
Server code
import asyncio

from quart import Quart, websocket

app = Quart(__name__)

@app.websocket("/ws")
async def ws():
    count = 0
    await websocket.accept()
    while count < 100:
        await websocket.send(str(count))
        count += 1
        # Only with sleep > 0.1 do the messages appear on the client
        await asyncio.sleep(0.1)

app.run(port=5000)
Client code
from dash_extensions import WebSocket
from dash_extensions.enrich import DashProxy, Input, Output, html

# Create example app.
app = DashProxy(prevent_initial_callbacks=True)
app.layout = html.Div([
    html.Div(id="message"),
    WebSocket(url="ws://127.0.0.1:5000/ws", id="ws"),
])

@app.callback(Output("message", "children"), [Input("ws", "message")])
def message(e):
    print(f"{e['data']}")
    return f"{e['data']}"

app.run_server()
Output - without the sleep, consecutive numbers are missing from the output. (With the sleep, all numbers 0-99 are there.)
127.0.0.1 - - [07/Oct/2022 11:09:00] "GET / HTTP/1.1" 200 -
....
0
127.0.0.1 - - [07/Oct/2022 11:09:00] "POST /_dash-update-component HTTP/1.1" 200 -
89
127.0.0.1 - - [07/Oct/2022 11:09:01] "POST /_dash-update-component HTTP/1.1" 200 -
99
127.0.0.1 - - [07/Oct/2022 11:09:01] "POST /_dash-update-component HTTP/1.1" 200 -
I'm facing the same issue. Did you ever find a solution?
If you are streaming data faster than it can be processed, messages will be lost. If you use a clientside callback, which executes faster, the effect should be less significant.
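The loss mode described here is effectively last-write-wins: the component appears to hold only the most recent message, so anything that arrives between two callback invocations is overwritten. A toy comparison in plain Python (not Dash code) of the two delivery semantics:

```python
import queue

# Last-write-wins: the consumer only sees whatever value is present
# when it finally looks, so intermediate messages are lost.
latest = None
for msg in range(5):
    latest = msg  # each new message overwrites the previous one
sampled = [latest]  # the callback fires once and sees only the last value

# Queued delivery: every message survives until it is consumed.
q = queue.Queue()
for msg in range(5):
    q.put(msg)
consumed = []
while not q.empty():
    consumed.append(q.get())

print(sampled)   # → [4]
print(consumed)  # → [0, 1, 2, 3, 4]
```

With queued delivery, a slow consumer falls behind but eventually sees everything; with last-write-wins, it silently skips.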
Is there any way to force the callback to be called for each WebSocket message, regardless of whether the previous messages have been processed or not?
Not with the current implementation - it simply relays the messages. It sounds like you need queuing and/or buffering of the data.
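One way to add such buffering on the server side is to push updates into an `asyncio.Queue` and have the sender drain everything that has accumulated into a single WebSocket message, so the client gets one callback per batch instead of one per item. A minimal sketch of just the draining logic (`drain_batch` is a hypothetical helper, shown without the Quart plumbing):

```python
import asyncio
import json

async def drain_batch(q: asyncio.Queue) -> list:
    """Block for the first item, then grab everything already queued."""
    batch = [await q.get()]
    while not q.empty():
        batch.append(q.get_nowait())
    return batch

async def demo() -> str:
    q: asyncio.Queue = asyncio.Queue()
    for i in range(5):  # five updates arrive faster than they can be sent
        q.put_nowait(i)
    # In a real endpoint you would do: await websocket.send(payload)
    return json.dumps(await drain_batch(q))

print(asyncio.run(demo()))  # → [0, 1, 2, 3, 4]
```

The client then parses each message as a JSON array and processes every element, so nothing is lost even if the callback fires less often than the server sends.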
This is what I've tried: I use a Dash clientside_callback to process the received websocket messages in the browser. Whenever a websocket message is received by the browser, the following callback is supposed to be called:
function (msg) {
    console.log("Received message: " + msg);
}
But even then, most messages are lost, i.e. the console.log call only happens for roughly 10% of the messages, depending on how fast they are being sent. Sometimes the callback function is even called out of order.
Now, what surprises me is that when I look at the WebSocket connection in the browser's network tab, I see all messages being received fine, in order. Yet the clientside callback is not called for most of them (and sometimes the order is lost). Is that a bug in Dash's clientside callback implementation? Is there a way to work around that problem?
Not with the current implementation - it simply relays the messages. It sounds like you need queuing and/or buffering of the data.
@emilhe Could you explain what you meant by buffering the data: where should the buffering be implemented?
I'm running into the same issue - the callback isn't called for each message sent, particularly when they are sent with little delay. @petoncle, @emilhe has there been any progress on a buffering solution?
The WebSocket component itself has been kept simple, with focus on the basic WebSocket API. Due to the way Dash works, this design choice results in some shortcomings - e.g. the one mentioned here.
Depending on your needs, the SSE component may be an option. It was designed for a chatbot interface use case, so it supports both concatenation (i.e. internal buffering) and animation of the incoming response.
What is your use case @adcox ?
I hadn't looked at SSE, so I'll read more about it and see if it works for me.
My use case works well most of the time with the WebSocket, but sometimes (e.g. when a new client connects) the server needs to deliver a bunch of historical data from the sensor at once before resuming regular updates of new data. Because there is a lot of data, it doesn't fit into one WebSocket message; it has to be split into multiple messages and then delivered artificially slowly so that the client-side WebSocket can process the data.
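For that kind of historical backfill, a common pattern is to tag each chunk with a sequence number and a total count, so the client can reassemble the history and detect gaps rather than relying on artificial delays. A sketch of the idea (`chunk_payload` is a hypothetical helper, not part of dash_extensions):

```python
import json

def chunk_payload(items: list, chunk_size: int) -> list:
    """Split a large list into numbered JSON messages for reassembly."""
    chunks = [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]
    return [
        json.dumps({"seq": n, "total": len(chunks), "data": chunk})
        for n, chunk in enumerate(chunks)
    ]

messages = chunk_payload(list(range(10)), 4)
print(len(messages))             # → 3
print(json.loads(messages[-1]))  # → {'seq': 2, 'total': 3, 'data': [8, 9]}
```

The client buffers chunks until `seq == total - 1` and can request a resend if a sequence number is missing, which decouples correctness from delivery speed.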
Dropping messages with websockets was quite a shocking discovery for me.
I tried SSE, but I don't think each client should have to hold all the data in memory for their whole session just to chop off the last bit each time. Is there any existing, efficient way to handle this?