
Tflite format for on-device inference

Open brunofmf opened this issue 4 years ago • 3 comments

Good day.

Just to check: would I be able to save a TF-DF model (e.g., in TFLite format) and then load the model to perform on-device inference on smartphones?

Thank you. Cheers.

brunofmf avatar Jun 08 '21 19:06 brunofmf

Hi Bruno,

Unfortunately, it does not yet work with TFLite -- it's on our list of TODOs, but not there yet.

In the meantime, we do have a C++ interface in the associated Yggdrasil library: you can load the model trained in TF from the C++ library and run it there. The C++ library is pretty lightweight (no dependency on the TensorFlow engine) and runs very fast on CPUs. Maybe it can be easily compiled for phones? (We haven't tried that yet, though.)
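For reference, a minimal sketch of what C++ inference with the Yggdrasil serving API looks like, following the pattern in the Yggdrasil Decision Forests examples. The model path (`/path/to/model`) and the feature name (`age`) are placeholders for whatever your trained model actually uses:

```cpp
#include <memory>
#include <vector>

#include "yggdrasil_decision_forests/model/model_library.h"
#include "yggdrasil_decision_forests/serving/example_set.h"

namespace ydf = yggdrasil_decision_forests;

int main() {
  // Load the model exported by TF-DF (the Yggdrasil model is stored in the
  // "assets" sub-directory of the TensorFlow SavedModel).
  std::unique_ptr<ydf::model::AbstractModel> model;
  QCHECK_OK(ydf::model::LoadModel("/path/to/model", &model));

  // Compile the model into a fast inference engine.
  std::unique_ptr<ydf::serving::FastEngine> engine =
      model->BuildFastEngine().value();
  const auto& features = engine->features();

  // Look up a feature by name ("age" is a placeholder).
  const auto age_feature = features.GetNumericalFeatureId("age").value();

  // Allocate and fill one example.
  auto examples = engine->AllocateExamples(/*num_examples=*/1);
  examples->FillMissing(features);  // Default all features to "missing".
  examples->SetNumerical(/*example_idx=*/0, age_feature, 35.f, features);

  // Run inference; predictions holds one value per example (per class).
  std::vector<float> predictions;
  engine->Predict(*examples, /*num_examples=*/1, &predictions);
  return 0;
}
```

Building this for Android/iOS would still require cross-compiling the library and its (non-TensorFlow) dependencies, which, as noted above, hasn't been tried yet.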

janpfeifer avatar Jun 09 '21 12:06 janpfeifer

I'm marking this as a duplicate of #5 (about TF-Lite), because when we integrate the TF-DF ops into TF Lite we hope it will also work on-device.

janpfeifer avatar Jun 09 '21 12:06 janpfeifer

Ok Jan, thanks!

brunofmf avatar Jun 09 '21 12:06 brunofmf