Could this benefit 'TensorFlow Lite for Microcontrollers' models?
Models on microcontrollers (e.g. the RPi Pico, ARM-based) are very hardware-constrained and could benefit greatly from this. And I know it's possible to convert a TF model to a TF Lite one.
Could nebullvm be applied to the 'TF Lite for Microcontrollers' flow to improve inference, and/or is this a supported use case already?
I don't see TF Lite supported currently.
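For context, the TF to TF Lite conversion mentioned above is handled by the stock `tf.lite.TFLiteConverter`; a minimal sketch is below. The toy Keras model is just a placeholder, and this does not involve the nebullvm API.

```python
import tensorflow as tf

# Placeholder Keras model standing in for whatever model would be deployed.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Standard TF -> TF Lite conversion.
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Optional: default post-training optimizations (e.g. quantization),
# which matter most on hardware-constrained targets like the RPi Pico.
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()

# The resulting flatbuffer can be saved and, for TF Lite Micro,
# embedded into firmware as a C array (e.g. via `xxd -i model.tflite`).
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```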
Hi @NicoJuicy. We are not currently supporting TF Lite, but this can definitely be an interesting feature to include in the future! In the coming days we will draw up a roadmap for the planned releases of nebullvm and we can think about adding support for TFLite as well.
@morgoth95 thank you for this wonderful repo! Supporting edge devices and deployment (TF-lite / CoreML) should be given top priority since we are really looking for speed and reduced computational cost when working with edge devices (as opposed to cloud training, which is important, but less so).
Hi, thanks for the response.
I should mention that I'm working with TensorFlow Lite Micro, which really means very low-powered devices.
But it also seems like the best match for this use case; just a guess.
Hello @morgoth95, I can see a TOT commit for a TFLite backend. Are TFLite models supported now? If yes, can we please reflect the updates in the docs as well?