tflite_native
Flutter Android and iOS
Works with libtensorflow_c 1.15.0 on iOS and Android, including a nice example.
CI fails because I am unable to conditionally import dart:cli; see https://github.com/dart-lang/tflite_native/issues/16. So far, the closest I can think of is import 'cli_stub.dart' as cli if (dart.library.mirrors) 'dart:cli'; based on this list https://github.com/dart-lang/sdk/blob/master/pkg/dev_compiler/lib/src/compiler/shared_command.dart but it does not work for desktop (this particular commit works on mobile). Any guidance on how to make dart:cli compatible with mobile is appreciated.
I have already filed an issue: https://github.com/tensorflow/tensorflow/issues/32523
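For readers following along, the stub pattern being attempted can be sketched as two files. All file names here are hypothetical, and this shows the attempt, not a working solution: the author reports that dart.library.cli was not usable as a condition, hence the dart.library.mirrors proxy, and that the import still fails on desktop.

```dart
// cli_stub.dart (hypothetical): mirrors the slice of the dart:cli API we
// need, for platforms where dart:cli does not exist.
T waitFor<T>(Future<T> future) =>
    throw UnsupportedError('waitFor requires the standalone Dart VM');

// lib.dart (hypothetical): the conditional import as attempted above,
// falling back to the stub when the condition is false:
//   import 'cli_stub.dart' as cli if (dart.library.mirrors) 'dart:cli';
```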
@dcharkes I think most of your concerns have been addressed now that I have merged from latest master.
👍
I think @lambdabaa is the one who should review the tflite aspect of this PR.
@lambdabaa please keep in mind that this PR is still struggling with conditionally importing dart:cli
@truongsinh I am not sure about conditionally importing dart:cli. Maybe @devoncarew knows? Is there a reason why the tests are failing?
@truongsinh, thanks for the PR!
However, I think that for Flutter apps people are much better off using a Flutter plugin to access TensorFlow Lite. I updated the readme to clarify this (https://github.com/dart-lang/tflite_native).
@devoncarew I have previously explained why I personally don't want to use a plugin in https://github.com/dart-lang/tflite_native/issues/16#issuecomment-531364569
I had previously used https://github.com/shaqian/flutter_tflite, but it uses platform channels rather than FFI, which is what I'm interested in.
You can imagine how much overhead it takes to transfer/serialize/deserialize the image stream across Flutter <-- platform channel --> Android Runtime <-- JNI --> C lib.
I have written an article and given several talks at DevFest, and so far the audience is really interested in "near-realtime cross-platform on-device ML inference".
@lambdabaa
Is there a reason why the tests are failing?
The tests fail because of dart:cli.
Please feel free to guide me to any possible solution so that we can make this package truly cross-platform (I even envision we can use TensorflowJS/WA for web later)
@devoncarew btw, in the clarification you mention https://github.com/flutter/flutter/issues/14815, which I think is inaccurate. Our problem right now is not with Isolate.resolvePackageUri. In fact, in the whole code snippet, you can see that if we are on either iOS or Android, we don't need dart:cli or Isolate.resolvePackageUri at all:
import 'dart:cli' as cli;
import 'dart:ffi';
import 'dart:io';
import 'dart:isolate';

// (Function name is illustrative; the branches preceding the Android
// check were elided in the original excerpt.)
DynamicLibrary _open() {
  String objectFile;
  if (Platform.isAndroid) {
    objectFile = 'libtensorflowlite_c.so';
  } else if (Platform.isIOS) {
    // iOS: the symbols are linked into the app binary itself.
    return DynamicLibrary.process();
  } else {
    // Desktop: the only branch that needs dart:cli / Isolate.resolvePackageUri.
    final rootLibrary = 'package:tflite_native/tflite.dart';
    final blobs = cli
        .waitFor(Isolate.resolvePackageUri(Uri.parse(rootLibrary)))
        .resolve('src/blobs/');
    objectFile = blobs.resolve(_getObjectFilename()).toFilePath();
  }
  return DynamicLibrary.open(objectFile);
}
As discussed in another thread with @timsneath and @mit-mit, it's believed that dart:cli is not recommended for use -- the one method (waitFor) is experimental but going to be marked as deprecated. In general, Dart aims to be cross-platform, and developers shouldn't need to do conditional imports etc.
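For readers unfamiliar with it, waitFor synchronously blocks the current isolate until a Future completes. A minimal sketch, assuming a Dart 2.x SDK where dart:cli still provides it (it was experimental and has since been deprecated and removed):

```dart
import 'dart:cli' as cli;

void main() {
  // waitFor pumps the event loop and blocks until the future completes.
  // This only exists on the standalone Dart VM, which is exactly why it
  // limits portability: dart:cli is absent on Flutter mobile and the web.
  final answer = cli.waitFor<int>(
      Future<int>.delayed(const Duration(milliseconds: 10), () => 42));
  print(answer); // prints: 42
}
```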
With that in mind, for tflite_native, instead of keeping tflite_native for desktop only, should we think about how to get rid of dart:cli and make tflite_native truly native?
I'd be happy to remove waitFor; it's a lesser-used feature, and does limit the portability of code.
With that in mind, for tflite_native, instead of keeping tflite_native for desktop only, should we think about how to get rid of dart:cli and make tflite_native truly native?
I'm concerned about increasing the role and scope of this package, and the increase in the maintenance for the package. It sounds like the flutter plugin for tensorflow would meet most needs, modulo when you need to transfer large amounts of data with low latency. I'm curious if you've run performance tests - if you've seen issues in practice, or are anticipating them.
If we can't reach agreement to support mobile in this package (in addition to the existing desktop support), you can always fork the package and add mobile support to that (or, work with the flutter ML plugin to see if there are ways to address performance concerns).
I'd be happy to remove waitFor; it's a lesser-used feature, and does limit the portability of code.
Thanks, I think it would at least remove this impediment, regardless of whether we are going to merge this PR or maintain an independent fork.
If we can't reach agreement to support mobile in this package (in addition to the existing desktop support), you can always fork the package and add mobile support to that
Totally agree.
I'm curious if you've run performance tests - if you've seen issues in practice, or are anticipating them.
I can't say what I did was a proper performance test, but I did try both https://github.com/shaqian/flutter_realtime_detection (using platform channels) and https://github.com/truongsinh/flutter_ssd_mobilenet (using this plugin/FFI). The former (using platform channels) has ~200ms detection time*, while the latter (using this plugin/FFI) has ~60ms detection time*. Both were tested on my Pixel 2, with the "low resolution" preset for the image stream coming from the camera. The difference is indeed significant.
(detection time = pre-processing + inference + communication overhead)
but I did try both https://github.com/shaqian/flutter_realtime_detection (using platform channel) and https://github.com/truongsinh/flutter_ssd_mobilenet (using this plugin/FFI)
Thanks for the data!
cc @mit-mit, @dcharkes, @sjindel-google, cc @cbracken (and other platform channel experts?) for thoughts on dart:ffi vs flutter platform channels for high data volume, low latency use cases
Low latency
dart:ffi, because it's all synchronous. (Method channels are asynchronous.)
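That synchronous nature is easy to see in a minimal dart:ffi call into libc. A sketch, assuming a Linux/macOS host (so that strlen is visible via DynamicLibrary.process()) and the pub ffi package for string helpers:

```dart
import 'dart:ffi';
import 'package:ffi/ffi.dart'; // pub package: Utf8, malloc

// Native and Dart signatures for C's `size_t strlen(const char *s)`.
typedef _StrlenC = IntPtr Function(Pointer<Utf8>);
typedef _StrlenDart = int Function(Pointer<Utf8>);

void main() {
  // Look up strlen among the symbols already loaded into this process.
  final strlen = DynamicLibrary.process()
      .lookupFunction<_StrlenC, _StrlenDart>('strlen');
  final native = 'tflite'.toNativeUtf8();
  // A plain synchronous C call: no message loop, no serialization.
  final len = strlen(native);
  malloc.free(native);
  print(len); // prints: 6
}
```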
High data volume
I'm not familiar with the method channel characteristics when dealing with large amounts of data.
dart:ffi's memory benchmarks do 28 million* reads or writes per second on my MacBook in JIT mode. That's 28 MB/s if storing int8 values and 224 MB/s when writing int64 values. In AOT mode this is 4x as fast: 125 MB/s - 900 MB/s. And JIT mode will be as fast as AOT after landing this CL.
I'm not aware of any benchmarks of method channels, so I don't know how they compare.
Note that one can avoid copying data altogether by exposing native data in Dart through Pointer.asTypedList().
*If I understand correctly how the benchmark suite reports its numbers. (70 microseconds for 1000 reads and writes, repeated 10 times.)
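A minimal sketch of that zero-copy pattern, assuming the pub ffi package for its calloc allocator (pure dart:ffi has no allocator of its own):

```dart
import 'dart:ffi';
import 'package:ffi/ffi.dart'; // pub package: calloc

void main() {
  // Allocate 8 bytes of native memory, then view it as a Uint8List.
  // asTypedList does not copy: the list is backed by the native buffer.
  final Pointer<Uint8> ptr = calloc<Uint8>(8);
  final view = ptr.asTypedList(8);
  view[0] = 42;
  // Writes through the typed-data view are visible through the pointer,
  // because both refer to the same memory.
  print(ptr[0]); // prints: 42
  calloc.free(ptr); // the view must not be used after this point
}
```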
/cc @chinmaygarde for comments on platform channels and @liyuqian for thoughts on benchmarks.
I'm not aware of any benchmarks on platform channels in https://flutter-dashboard.appspot.com/benchmarks.html yet. I'm certainly interested in adding them so we can give a definite answer on how fast Flutter is on platform channels and dart:ffi respectively, with a real device like MotoG4 or iPhone6S.
The superior performance of dart:ffi over platform channels does sound reasonable to me. I'm curious why dart:ffi can't be brought to Flutter ML plugin? I'm personally very interested in using dart:ffi more often. For example, can we now replace the tonic in Flutter engine with ffi?
I'm curious why dart:ffi can't be brought to Flutter ML plugin?
Can you clarify what plugin(s) you are referring to? I myself am quite interested in contributing to those plugins to use FFI instead of platform channels (this tflite_native is one example).
@truongsinh: I was referring to https://github.com/shaqian/flutter_tflite that you mentioned in https://github.com/dart-lang/tflite_native/issues/16#issuecomment-531364569 . That seems to be a popular and well-maintained plugin according to https://pub.dev/packages/tflite#-analysis-tab-
Any news here? It seems there hasn't been any commit in the flutter plugin in the last 6 months. /cc @shaqian
This work must be done after #42
Hello Dart team,
After working with flutter_tflite, I noticed a huge drop in performance, as @truongsinh explained (many layers, slow messaging channels), which makes flutter_tflite not an option for real-time applications.
After seeing all the great work in this repo, I would ask: is there a plan/hope to get this project supporting Flutter? Or should someone just fork the current work and start from there?
See https://summerofcode.withgoogle.com/projects/#5984744350679040
Hi,
The TensorFlow Lite Flutter Plugin, am15h/tflite_flutter_plugin, has been published under the guidance of the TensorFlow Lite team. It is similar to the TFLite Java API and also includes all the mobile-specific support.
It is built on top of a clone of this repository, dart-lang/tflite_native.