
Could this benefit 'TensorFlow Lite for MicroControllers' models

Open NicoJuicy opened this issue 2 years ago • 4 comments

Models on microcontrollers (e.g. the RPi Pico, ARM-based) are very hardware-constrained and could benefit greatly from this. I know it's possible to convert a TF model to a TF-Lite one.
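For context, the TF-to-TF-Lite conversion mentioned above can be sketched roughly as follows (a minimal example using TensorFlow's standard `tf.lite.TFLiteConverter` API; the tiny Keras model is only an illustrative stand-in for a real trained model):

```python
import tensorflow as tf

# Small illustrative Keras model standing in for any trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Convert to a TFLite flatbuffer. Optimize.DEFAULT enables post-training
# optimizations (e.g. quantization) that help on constrained devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()  # bytes of the .tflite flatbuffer

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting `.tflite` file is what the TFLite (and TFLite Micro) interpreter consumes, so any nebullvm optimization would presumably need to slot in before or around this conversion step.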

Could 'nebullvm' be applied to the "TF Lite for Microcontrollers" flow to improve inference, or is this a supported use case already?

I don't see TF Lite supported currently.

NicoJuicy avatar Mar 22 '22 13:03 NicoJuicy

Hi @NicoJuicy. We are not currently supporting TF Lite, but this can definitely be an interesting feature to include in the future! In the coming days we will draw up a roadmap for the planned releases of nebullvm and we can think about adding support for TFLite as well.

diegofiori avatar Mar 23 '22 09:03 diegofiori

@morgoth95 thank you for this wonderful repo! Supporting edge devices and deployment (TF-lite / CoreML) should be given top priority since we are really looking for speed and reduced computational cost when working with edge devices (as opposed to cloud training, which is important, but less so).

AvivSham avatar Mar 23 '22 09:03 AvivSham

Hi, thanks for the response.

I should mention that I'm working with TensorFlow Lite Micro, which really means very low-powered devices.

But it also seems like the best match for this use case, though that's just a guess.

NicoJuicy avatar Mar 23 '22 20:03 NicoJuicy

> Hi @NicoJuicy. We are not currently supporting TF Lite, but this can definitely be an interesting feature to include in the future! In the coming days we will draw up a roadmap for the planned releases of nebullvm and we can think about adding support for TFLite as well.

Hello @morgoth95, I can see a TOT (top-of-tree) commit for the TFLite backend. Are TFLite models supported now? If so, can we please reflect the updates in the docs as well?

Nick-infinity avatar Jun 30 '22 10:06 Nick-infinity