Remy Baudet
I'm having the same problem. Is there a way or workaround to handle those models?
Have you been able to run the quantized MobileNet model? It can be found here: https://www.tensorflow.org/lite/guide/hosted_models I've had no success with Mobilenet_V1_1.0_224_quant :( I have tried with both runModelOnImage and runModelOnBinary...
I'm using the example provided with the tflite package: https://github.com/shaqian/flutter_tflite/tree/master/example with an additional asset for the model (the labels are the same as for the non-quantized model) in pubspec.yaml: `...
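For reference, a minimal sketch of how I'm loading and running the model with the tflite plugin. The asset path and labels filename are just what I use locally; for the quantized model I've also tried passing raw byte values (imageMean: 0, imageStd: 255), but I'm not sure that's the correct preprocessing:

```dart
import 'package:tflite/tflite.dart';

Future<void> classify(String imagePath) async {
  // Load the quantized model; paths assume the assets are declared in pubspec.yaml.
  await Tflite.loadModel(
    model: "assets/mobilenet_v1_1.0_224_quant.tflite",
    labels: "assets/labels.txt", // same labels file as the float model
  );

  // For a quantized (uint8) model, the input is presumably raw bytes,
  // so imageMean/imageStd are set to pass pixel values through unscaled.
  final recognitions = await Tflite.runModelOnImage(
    path: imagePath,
    imageMean: 0.0,
    imageStd: 255.0,
    numResults: 5,
    threshold: 0.1,
  );
  print(recognitions);
}
```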
It's an image classification model I believe...
I'm looking at a video call option for Flutter, and Agora seems promising, but I need web support. Is there a way around this package's limitation of not supporting...
Thanks for the feedback, I will look at this issue!
Thank you for your collaboration! I haven't looked at my project for a while, as I didn't get any notifications... I'm going to look at merging your patch!