react-native-mlkit
Stream camera frame to ObjectDetector?
I have been using react-native-vision-camera + react-native-fast-tflite. Unfortunately, I need to investigate an alternative way of passing images to the model, which happens seamlessly with react-native-fast-tflite. I noticed that this MLKit library needs a URI path string passed to the model. That would require converting each frame from react-native-vision-camera's frameProcessor into an image on the device and then creating a temp file path, which would be very memory- and storage-intensive, so not a good idea.
Does this MLKit library work well with frame streams, or is it designed more for single-image processing? I see you can set the detectorMode: 'stream' option when initializing the model options, but it still requires a string path?
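For reference, a minimal sketch of the single-image flow being described, assuming a detector method along the lines of detectObjects(uri) (that method name is an assumption for illustration, not the library's confirmed API). The point is that a file path works fine for a still photo, but not for a per-frame stream:

```tsx
// Sketch of the single-image / file-path flow. `detectObjects(uri)` is an
// assumed detector shape for illustration only; the vision-camera calls
// (takePhoto, PhotoFile) are real.
import { Camera, type PhotoFile } from "react-native-vision-camera";

async function detectFromStill(
  camera: Camera,
  detector: { detectObjects: (uri: string) => Promise<unknown> },
) {
  // Capture a still image to disk and hand its path to the detector.
  const photo: PhotoFile = await camera.takePhoto();
  // Fine for one-off images; writing frameProcessor frames to temp files
  // 30-60 times per second is the part the question is trying to avoid.
  return detector.detectObjects(`file://${photo.path}`);
}
```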
Yeah same issue
This is a good question -- I think there is likely a way to implement this, but it'll take some work.
MLKit definitely supports streaming, but we'd need to implement a module interface for streams. We should probably remove the option in the meantime, to prevent confusion.
Took a look, and this could be implemented by writing a Frame Processor for something like react-native-vision-camera. I'm going to take a swing at writing one that we can export from the module, but they are not all that difficult to implement.
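Roughly, a frame processor exported from the module could be consumed like the sketch below. Only the react-native-vision-camera (v3+) wiring here is real API; the native plugin name ("detectObjects") and the shape of its results are hypothetical placeholders for whatever the module ends up registering:

```tsx
// Hedged sketch: consuming a hypothetical "detectObjects" frame processor
// plugin with react-native-vision-camera. No temp files -- frames go
// straight to native code.
import React from "react";
import {
  Camera,
  useCameraDevice,
  useCameraPermission,
  useFrameProcessor,
  VisionCameraProxy,
  type Frame,
} from "react-native-vision-camera";

// Look up the (hypothetical) native plugin the module would register.
const plugin = VisionCameraProxy.initFrameProcessorPlugin("detectObjects", {});

function detectObjects(frame: Frame): unknown {
  "worklet";
  if (plugin == null) {
    throw new Error('Frame processor plugin "detectObjects" not found');
  }
  // Pass the raw camera frame to native code for MLKit to process.
  return plugin.call(frame);
}

export function DetectionCamera() {
  const device = useCameraDevice("back");
  const { hasPermission } = useCameraPermission();

  const frameProcessor = useFrameProcessor((frame) => {
    "worklet";
    const detections = detectObjects(frame);
    console.log(detections);
  }, []);

  if (!hasPermission || device == null) return null;

  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
    />
  );
}
```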
Cool! If there’s anything I can do to help, let me know. It would be a learning experience for me too.
Yes a frame processor makes sense!
Now I just need to figure out why my TFLite model isn’t working on your latest release.