flutter_tensorflow_lite
Allow models from server
Is there any way we can use models from a server instead of keeping them in the assets folder?
Well, currently it only supports loading files from assets. This would be a nice feature to add in a future release. Thanks!
Actually, loading from assets (on device) is more efficient: latency is lower, and it works even in offline mode, unlike requesting the model from a service. That said, depending on your use case, fetching from a server may still be preferable.
If so, a simple REST API built with Node.js or Flask can serve the model file (I might be able to help with that).
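For the Flask option, a minimal sketch might look like the following. The route name (`/model`) and file path (`model.tflite`) are illustrative assumptions, not part of this plugin:

```python
# Minimal sketch: serve a TFLite model file over HTTP with Flask.
# `/model` and MODEL_PATH are hypothetical names chosen for illustration.
import pathlib

from flask import Flask, Response

MODEL_PATH = pathlib.Path("model.tflite")  # assumed location of the model file

app = Flask(__name__)

@app.route("/model")
def get_model():
    # Read the model binary and return it with a generic binary MIME type;
    # the Flutter client can save the bytes to a local file before loading.
    data = MODEL_PATH.read_bytes()
    return Response(data, mimetype="application/octet-stream")

# To serve locally: app.run(host="0.0.0.0", port=8000)
```

On the Flutter side, the app would download the bytes (e.g. with an HTTP GET), write them to a local file, and point the interpreter at that path instead of an asset.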
Or better still, you can check out Firebase ML Kit.