segment-anything
Unable to replicate browser inference speed
An official contributor mentioned in #163 that we should be able to get ~50ms of latency with the provided demo.
This is our official implementation of how to run mask prediction using the onnx model with multithreading in the browser and the precomputed image embedding. This should have ~50ms latency. https://github.com/facebookresearch/segment-anything/tree/main/demo
However, I'm only able to get 95-100ms at best using that same demo code running locally. It's not the speed of my client, though: on the online demo (https://segment-anything.com/demo), in the same browser and on the same image, inference is visibly much faster. I can't profile the online demo directly, but it appears to be running at around the reported 50ms.
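For reference, here is roughly how I'm measuring latency locally (a sketch; `runDecoder` is a hypothetical wrapper around whatever the demo passes to `session.run()`, the timing helper itself is generic):

```javascript
// Hedged sketch: time only the ONNX decoder call, excluding React rendering.
// `runDecoder` is a hypothetical async callback wrapping the demo's
// session.run() invocation.
async function timeRun(runDecoder, warmup = 3, iters = 10) {
  // Warm-up runs let the WASM/JIT settle before measuring.
  for (let i = 0; i < warmup; i++) await runDecoder();
  const t0 = performance.now();
  for (let i = 0; i < iters; i++) await runDecoder();
  return (performance.now() - t0) / iters; // mean latency in ms
}
```

Timing only the decoder call (rather than the whole click-to-render path) rules out rendering overhead as the source of the gap.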
I made sure that SharedArrayBuffer is enabled and that I'm using the SIMD-threaded model, but I still can't get below 95-100ms locally.
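For anyone checking the same thing: SharedArrayBuffer only becomes available when the page is cross-origin isolated. A minimal webpack 5 devServer config that sends the required headers looks like this (an assumption about the demo's setup, not a quote from its config):

```javascript
// Hedged sketch: COOP/COEP headers that make the browser grant
// SharedArrayBuffer, which the threaded WASM backend needs.
// These keys are standard webpack 5 devServer options.
module.exports = {
  devServer: {
    headers: {
      'Cross-Origin-Opener-Policy': 'same-origin',
      'Cross-Origin-Embedder-Policy': 'require-corp',
    },
  },
};
```

With these headers in place, `self.crossOriginIsolated` should report `true` in the browser console; if it reports `false`, the runtime will silently fall back to a single thread.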
Any tips here on how to get to the reported ~50ms (and the ~50ms I can see on the online demo)?
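Concretely, this is how I'm enabling threading before creating the session (a sketch; the helper name is mine, while `ort.env.wasm.numThreads` and `ort.env.wasm.simd` are onnxruntime-web's actual flags):

```javascript
// Hedged sketch: decide the WASM thread count before creating the session.
// pickWasmThreads is a hypothetical helper; the ort.env.wasm flags it feeds
// are the ones onnxruntime-web's threaded build reads.
function pickWasmThreads(crossOriginIsolated, hardwareConcurrency) {
  // Without COOP/COEP isolation, SharedArrayBuffer is unavailable and the
  // runtime quietly falls back to a single thread.
  if (!crossOriginIsolated) return 1;
  // Cap the thread count; for a model this small, more threads rarely help.
  return Math.min(4, hardwareConcurrency || 1);
}

// In the browser (assumption about how the demo wires this up):
//   import * as ort from 'onnxruntime-web';
//   ort.env.wasm.numThreads = pickWasmThreads(
//     self.crossOriginIsolated,
//     navigator.hardwareConcurrency
//   );
//   ort.env.wasm.simd = true;
```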
Hi, I'm also trying to replicate the demo locally, but I'm facing some issues. I followed the README thoroughly, but when I run `yarn && yarn start` I get the error: `ERROR in File size (2564550879) is greater than 2 GiB` (webpack 5.82.0 compiled with 1 error in 5113 ms).
I am able to open localhost and see the image, but I can't work out how to actually segment it. Clicking places a cross on the image, and I can visualize the masks, but I can't select anything.