tyrmullen
Thanks for the details -- I believe they let me track down the issue. It appears that GPU postprocessing is actually fine here for that model:
- The model uses a...
I don't know for sure, but my best guess at the moment would be any device whose GPU supports a max texture size of >= 8192. If you browse...
This does seem to reinforce my suspicion that this is a difference in WebGL texture limits, but I'm a bit surprised there's a browser component there! ... That makes me...
Hmmm... did a little digging, and it sounds like Android Chrome may have decided to cap the max texture size at 4096 across all devices. According to internet comments,...
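For anyone wanting to check where a given device falls, the effective limit can be queried at runtime from a WebGL context. A minimal sketch (the helper name, the injected `gl` parameter, and the 8192 default are mine, for illustration):

```javascript
// Returns true if the given WebGL context reports a max texture size
// large enough for the model (8192 in the case discussed above).
// `gl` is a WebGLRenderingContext, or any object with the same shape;
// it is passed in so the check can be exercised outside a browser.
function supportsTextureSize(gl, required = 8192) {
  const max = gl.getParameter(gl.MAX_TEXTURE_SIZE);
  return max >= required;
}

// In a page:
//   const gl = document.createElement("canvas").getContext("webgl2");
//   console.log(supportsTextureSize(gl));
```

On a device where Chrome caps the limit at 4096, this returns `false` for an 8192 requirement, which matches the failure mode above.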
The issue was tracked down to a small fix in the inference engine, and I was able to confirm that this model runs successfully on a Pixel 8 Pro running...
The problematic version has been marked deprecated and no longer appears by default on the npm version list [here](https://www.npmjs.com/package/@mediapipe/tasks-genai?activeTab=versions). Hopefully that is sufficient for the check-updates tool to skip...
Regular Safari does not have WebGPU support, and so cannot run our LLM Inference as of now. Safari Technology Preview, however, should generally work. The `maxStorageBufferBindingSize` error you are seeing...
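As a sketch of how an app might surface this kind of limit problem up front (the helper name and the byte threshold are illustrative, not part of our API): WebGPU exposes the adapter's ceiling via `adapter.limits.maxStorageBufferBindingSize`, which can be compared against what the model needs before requesting a device with raised `requiredLimits`.

```javascript
// Returns true if the adapter can satisfy the storage-buffer binding
// size a workload needs. `adapter` is a GPUAdapter (or any object with
// the same `limits` shape), injected for testability.
function canBindStorageBuffer(adapter, requiredBytes) {
  return adapter.limits.maxStorageBufferBindingSize >= requiredBytes;
}

// In a page (threshold here is just an example figure):
//   const adapter = await navigator.gpu.requestAdapter();
//   const needed = 512 * 1024 * 1024;
//   if (canBindStorageBuffer(adapter, needed)) {
//     const device = await adapter.requestDevice({
//       requiredLimits: { maxStorageBufferBindingSize: needed },
//     });
//   }
```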
Currently, LLM Inference on web has only a WebGPU implementation -- unlike our other non-experimental APIs, we do not have any alternative implementation, and thus unfortunately no fallback is...
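Until a fallback exists, feature-detecting WebGPU up front lets an app show a friendly message rather than failing at load time. A minimal sketch (the helper name and injected `nav` parameter are mine):

```javascript
// Returns true if the given navigator-like object exposes WebGPU
// (`navigator.gpu`). `nav` is injected so the check is testable;
// in a page, pass the real `navigator`.
function hasWebGpu(nav) {
  return typeof nav === "object" && nav !== null && "gpu" in nav && !!nav.gpu;
}

// In a page:
//   if (!hasWebGpu(navigator)) {
//     showMessage("This demo requires a WebGPU-capable browser.");
//   }
```

Note that `navigator.gpu` existing does not guarantee a usable adapter; a stricter check would also confirm that `navigator.gpu.requestAdapter()` resolves to a non-null adapter.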
+1 to everything satoren@ said :). In particular, generally try to keep things on GPU for best performance*. This is especially true with segmentation, since that can return an image...
@satoren Nice example; thanks for writing and sharing this! Two quick things I noticed, in case they are helpful:
- There are two versions of segmentForVideo, one which uses...
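For reference, a minimal sketch of the callback-style usage (the wrapper function is mine; this assumes `@mediapipe/tasks-vision`'s `ImageSegmenter`, whose callback form only guarantees the result is valid inside the callback, so the mask should be read or copied there):

```javascript
// Runs callback-style segmentForVideo on one video frame and hands the
// first confidence mask to `onMask`. The result object is only valid
// for the duration of the callback, so all reads happen inside it.
function segmentFrame(segmenter, videoFrame, timestampMs, onMask) {
  segmenter.segmentForVideo(videoFrame, timestampMs, (result) => {
    const mask = result.confidenceMasks && result.confidenceMasks[0];
    if (mask) {
      onMask(mask);
    }
  });
}

// In a page:
//   segmentFrame(imageSegmenter, videoEl, performance.now(), (mask) => {
//     // read or copy the mask here, before the callback returns
//   });
```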