react-use-websocket

Blocks UI thread

Open · lightrow opened this issue · 9 comments

When someone sends a lot of events to the socket (~5-10 per second), it locks the main thread completely until messages stop. Consider using web workers.

lightrow · Apr 24 '24 08:04

@lightrow do you mean sending messages from the client or receiving messages from the server?

robtaussig · Apr 25 '24 06:04

I'm experiencing the UI being blocked when rapidly receiving messages from the server. I can't really get around this, since I'm streaming prices for many (~20) different assets. Do you have any strategies to prevent UI blocking? This is using the hook with share: true, so the buffer solution posed in the other issue doesn't apply, I think.

sehyunc · Apr 26 '24 01:04

@sehyunc roughly how many components are sharing a socket? And can you verify that the bottleneck is the hook (vs. whatever UI changes happen as a result of the messages received)? One way to test would be to simply call the hook without doing anything with the lastMessage/lastJsonMessage value.

If the above test results in the same blocked thread, then I think maybe you need a custom solution. Ultimately the library is just calling setState whenever a message is received, which is necessary in order for your components to react to changes. I don't think WebWorkers can help here, as the heavy lifting will still be performed on the main thread.

If I had to come up with a solution that still leverages this library, then my first thought would be to use the filter option and always return false. This will prevent the hook from updating your components directly. Then you can set the onMessage callback to collect an array of messages and batch process them every few seconds. If this solves your problem, then I can look into ways to bake batching into the hook (optionally)

robtaussig · Apr 26 '24 03:04
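A minimal sketch of the filter/onMessage batching approach described above. SOCKET_URL, the 2-second flush interval, and the onBatch callback are illustrative assumptions, not part of the library:

```ts
import { useEffect, useRef } from 'react';
import useWebSocket from 'react-use-websocket';

// Hypothetical endpoint for illustration.
const SOCKET_URL = 'wss://example.com/feed';

function useBatchedMessages(onBatch: (events: MessageEvent[]) => void) {
  const buffer = useRef<MessageEvent[]>([]);

  useWebSocket(SOCKET_URL, {
    share: true,
    // Always returning false means the hook never calls setState,
    // so no component re-renders per message.
    filter: () => false,
    // Collect each message into a buffer instead.
    onMessage: (event) => {
      buffer.current.push(event);
    },
  });

  useEffect(() => {
    // Flush the buffer and process it as a batch every 2 seconds.
    const id = setInterval(() => {
      if (buffer.current.length === 0) return;
      const batch = buffer.current;
      buffer.current = [];
      onBatch(batch);
    }, 2000);
    return () => clearInterval(id);
  }, [onBatch]);
}
```

The key point is that filter: () => false keeps lastMessage/lastJsonMessage from ever updating, so re-renders happen only on your schedule.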

My case was with receiving. I was also using share: true. I have now rewritten my app to use a custom WebSocket hook that sets up a WebSocket inside a worker, and the UI thread is no longer blocked.

lightrow · Apr 26 '24 23:04

@lightrow very cool — would love to see what that looks like if you don’t mind posting it.

robtaussig · Apr 27 '24 03:04

I wouldn't say 5-10 per second is anywhere near "a lot of events", tbh.

Just guessing: this might be because useWebSocket contains these two pieces of state: lastMessage and lastJsonMessage. They force a re-render for every message, even if you're not reading them. Since useWebSocket likely sits very high up the component tree, that can have a big cost, especially if you have lots of components and things aren't memoized.

@robtaussig's suggestion should solve that ("use the filter option and always return false. This will prevent the hook from updating your components directly. Then you can set the onMessage callback to collect an array of messages and batch process them every few seconds"). You don't even necessarily need the batching. You just need your (probably existing) state manager to update only the components that need to re-render.

On @lightrow's suggestion of web workers: I've applied web workers to heavily active WebSockets in the past, which received very large JSON messages - 1000s of messages/s. We thought pushing the JSON string parsing to another thread would improve UI responsiveness, but I don't remember there being much - if any - perceivable benefit, and it added a fair amount of complexity. Just an anecdote.

My guess would be that if you "solved the problem with web workers", the dramatic win didn't actually come from the web workers, but rather from being kinder to React with more local state updates.

michaelboyles · May 03 '24 09:05
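A sketch of that idea: route socket messages into a small external store and have each component subscribe to only its own key, so a single price tick re-renders one component instead of the whole tree. The symbol/price shape is a hypothetical stand-in for the asset feed described earlier; React 18's useSyncExternalStore does the subscription work:

```ts
import { useCallback, useSyncExternalStore } from 'react';

type Listener = () => void;

// A tiny hand-rolled store: the latest price per asset symbol.
const prices = new Map<string, number>();
const listeners = new Map<string, Set<Listener>>();

// Call this from the socket's onMessage handler (or a batched flush).
export function updatePrice(symbol: string, price: number) {
  prices.set(symbol, price);
  listeners.get(symbol)?.forEach((notify) => notify());
}

// Only components subscribed to this symbol re-render when it changes.
export function usePrice(symbol: string): number | undefined {
  const subscribe = useCallback((listener: Listener) => {
    const set = listeners.get(symbol) ?? new Set<Listener>();
    listeners.set(symbol, set);
    set.add(listener);
    return () => set.delete(listener);
  }, [symbol]);

  return useSyncExternalStore(subscribe, () => prices.get(symbol));
}
```

Wired to the earlier batching sketch, onBatch would simply call updatePrice per message; an update to one symbol then re-renders only the component showing that symbol.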

It's just from reading around the web: it seems running sockets on the UI thread can indeed block it, which is why I assumed that was the case here as well. My use case was an AI chat app, where the message is streamed over WebSocket, word by word, and the client renders the message with a letter-by-letter typing animation. This animation runs in a while(true) loop and also makes a little beep sound for each letter typed. When words are generated fast (I didn't measure, but somewhere around 5-10 per second), the animation stuttered and halted until all streaming ended. Even after disabling the animation and the rendering of messages completely, leaving only the beeping sound, I observed (heard) that the while(true) loop was still being blocked. After the web worker rewrite, it all works perfectly without any other changes.

I doubt it's React-related though. In my scenario, I had useWebSocket() in the topmost App.tsx component (memoizing the children or not didn't make a difference), and a useEffect with lastMessage as a dependency that performed the necessary store updates depending on the message received. My implementation of the web worker socket hook is very primitive right now: the worker is created in a context provider and exposed via a hook; it doesn't store any state, and everything is handled via listeners. I've had zero experience with web workers in the past, so I struggled a bit with some things and left it alone as soon as it started working more or less correctly.

The code can be seen here: https://github.com/lightrow/NovaExpress/blob/main/client/src/components/socket.tsx and https://github.com/lightrow/NovaExpress/blob/main/client/src/components/websocketWorker.js

lightrow · May 07 '24 00:05
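For anyone who wants the generic shape of the pattern without reading the linked files (this is an illustrative sketch, not the linked implementation): the WebSocket and the JSON parsing both live in the worker, and the main thread only ever receives already-parsed objects.

```ts
// websocketWorker.ts -- runs off the main thread.
let socket: WebSocket | undefined;

self.onmessage = (event: MessageEvent) => {
  const { type, url, payload } = event.data;
  if (type === 'connect') {
    socket = new WebSocket(url);
    socket.onmessage = (msg) => {
      // Parse here so the main thread gets ready-to-use objects.
      self.postMessage(JSON.parse(msg.data));
    };
  } else if (type === 'send') {
    socket?.send(JSON.stringify(payload));
  }
};
```

```ts
// Main thread (the worker URL syntax assumes a modern bundler):
const worker = new Worker(new URL('./websocketWorker.ts', import.meta.url));
worker.postMessage({ type: 'connect', url: 'wss://example.com/chat' });
worker.onmessage = (event) => {
  // event.data is already parsed; dispatch it to listeners/store here.
};
```

One caveat that echoes the earlier anecdote: postMessage still structured-clones every parsed object onto the main thread, so the main thread is not entirely off the hook at high message rates.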

WebSockets are non-blocking. If they weren't, you could never use them without a web worker, because they would block the only thread. They can certainly handle 5 messages a second without needing a web worker; the vast majority of WebSocket apps don't use one. What's blocking is whatever you're doing in response to getting a message.

I restored the WS part of your client to how you've described it there. I mocked out the LLM part of your app because I don't want to run it, and stubbed it to return 50 messages in a burst. The changes are in this fork: https://github.com/michaelboyles/NovaExpress

Chrome shows they came over a period of ~500ms. That's 100 msg/s, 10x what you described, and the client coped with it fine. I could delete previous messages while the text was rendering, etc. My CPU is a beater from 10 years ago, too.

[image: Chrome DevTools screenshot of the message burst]

I can look into it a little more if you want, but you're going to need to make the problem easier to repro. Give the exact steps. Use Chrome's profiler and the React profiler. Export the data. We can't do much with "I heard the thread was blocked".

Your app at your volume of messages should work perfectly with or without a web worker. It sounds like you've fixed something via a mechanism you don't properly understand, and are now using that as justification to roll out the "fix" to all users.

michaelboyles · May 07 '24 07:05
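For reference, the generic shape of such a burst test (a sketch assuming the popular ws npm package on the server side, not the actual code in the fork):

```ts
// burst-server.ts -- a throwaway server that simulates a fast LLM stream.
import { WebSocketServer } from 'ws';

const wss = new WebSocketServer({ port: 8080 });

wss.on('connection', (socket) => {
  // Fire 50 messages back-to-back the moment a client connects.
  for (let i = 0; i < 50; i++) {
    socket.send(JSON.stringify({ type: 'token', text: `word${i} ` }));
  }
});
```

Point the client at ws://localhost:8080 and record a Chrome performance profile while the burst arrives; if the main thread stays responsive, the socket itself is not the bottleneck.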

Well, this is awkward. It's true that I cannot repro it myself in the current state of the app. The links I posted were only meant to show how it could be done with web workers, not as a repro case. Using a web worker would still be a good idea, but without a repro case, I believe this can be closed.

lightrow · May 07 '24 17:05