
[FEATURE] - Improve React Native Performance/Options

Open GantMan opened this issue 4 years ago • 0 comments

One of the benefits of React Native is that it isn't confined to JavaScript the way the browser is.

While the browser components can only use TensorFlow.js models, the React Native implementation could also use TFLite models for better performance.

I'd like to make the React Native component configurable so it can accept either a TFJS model or a TFLite model.

TODO: Allow AILabImage to take a configuration that can handle a TFLite model.
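
As a rough sketch only, the configuration could be a discriminated union; the tag, prop names, and TFLite path handling below are assumptions, not existing ai-lab API:

```tsx
// Hypothetical model configuration for AILabImage on React Native.
import type * as tf from '@tensorflow/tfjs';

type AILabModelConfig =
  | { modelType: 'tfjs'; model: tf.GraphModel | tf.LayersModel }
  | { modelType: 'tflite'; modelPath: string }; // path/asset resolved by native code
```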

Right now, AILabImage for React Native is hard-wired to use a TFJS model: https://github.com/infinitered/ai-lab/blob/master/packages/ai-lab-native/src/components/AILabNativeImage/AILabNativeImage.tsx#L33

Step 1:

AILabImage can be rewired to take a model as a parameter, just as the web version does: https://github.com/infinitered/ai-lab/blob/master/packages/ai-lab/src/components/AILabImage/AILabImage.tsx#L12
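
Usage could then look something like the following sketch. It assumes AILabNativeImage accepts a `model` prop like the web AILabImage; the import path, prop names, and image asset are assumptions:

```tsx
import React, { useEffect, useState } from 'react';
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-react-native';
import { AILabNativeImage } from 'ai-lab-native'; // hypothetical import path

export function Demo() {
  const [model, setModel] = useState<tf.GraphModel>();

  useEffect(() => {
    // Load the TFJS model in app code and hand it to the component,
    // instead of the component loading a hard-wired model itself.
    tf.ready()
      .then(() =>
        tf.loadGraphModel(
          'https://tfhub.dev/tensorflow/tfjs-model/ssd_mobilenet_v2/1/default/1',
          { fromTFHub: true }
        )
      )
      .then(setModel);
  }, []);

  return model ? (
    <AILabNativeImage model={model} source={require('./dog.jpg')} />
  ) : null;
}
```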

Step 2:

Native code would then be implemented. It should be wired up to a comparable SSD, like https://tfhub.dev/tensorflow/lite-model/ssd_mobilenet_v1/1/default/1, by having React Native drop down to native code.

There are example projects all over GitHub of people wiring TFLite into React Native. The goal here is to wire this up so a TFLite model can be passed into AILab much like a TFJS model.
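
One possible JS-side contract for such a bridge is sketched below. The module name `TFLiteDetector`, its methods, and the result shape are all assumptions, and the actual Kotlin/Swift implementation behind it is out of scope here:

```tsx
import { NativeModules } from 'react-native';

interface Detection {
  label: string;
  score: number;
  box: [number, number, number, number]; // [top, left, bottom, right], normalized
}

// Hypothetical native module exposed by the TFLite bridge.
interface TFLiteDetector {
  loadModel(modelPath: string): Promise<void>;
  detect(imageUri: string): Promise<Detection[]>;
}

const detector = NativeModules.TFLiteDetector as TFLiteDetector;

// AILabNativeImage could branch on the model config: TFJS models run through
// tfjs as today, while TFLite configs delegate to this bridge instead.
export async function runTFLite(modelPath: string, imageUri: string) {
  await detector.loadModel(modelPath);
  return detector.detect(imageUri);
}
```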

GantMan · Dec 28 '21 15:12