flutter-tflite
Using GPU for tflite
Hi, I want to use the mobile GPU to run inference on my model.
I followed the documentation and used this code to load the model:
import 'dart:io';
import 'package:tflite_flutter/tflite_flutter.dart';

final options = InterpreterOptions();
if (Platform.isAndroid) {
  options.addDelegate(GpuDelegateV2());
}
final interpreter =
    await Interpreter.fromAsset('assets/MaskPose.tflite', options: options);
but it logs errors like this:
E/tflite (27005): PADV2: Operation is not supported.
E/tflite (27005): 85 operations will run on the GPU, and the remaining 5 operations will run on the CPU.
I/tflite (27005): Replacing 85 node(s) with delegate (TfLiteGpuDelegateV2) node, yielding 2 partitions for the whole graph.
E/tflite (27005): Can not open OpenCL library on this device - undefined symbol: clGetCommandBufferInfoKHR
E/tflite (27005): Falling back to OpenGL
E/tflite (27005): TfLiteGpuDelegate Init: No shader implementation for transpose
I/tflite (27005): Created 0 GPU delegate kernels.
E/tflite (27005): TfLiteGpuDelegate Prepare: delegate is not initialized
E/tflite (27005): Node number 90 (TfLiteGpuDelegateV2) failed to prepare.
E/tflite (27005): Restored original execution plan after delegate application failure.
I/flutter (27005): Invalid argument(s): Unable to create interpreter
So I changed GpuDelegateV2() to XNNPackDelegate() and it works.
What is the difference between these two, and does XNNPackDelegate() use the GPU to run the model?
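For reference, the swap described above would look roughly like this (a sketch based on the snippet earlier in the thread; note that XNNPACK is TensorFlow Lite's optimized CPU backend, so this path runs on the CPU, not the GPU):

```dart
import 'dart:io';
import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> loadInterpreter() async {
  final options = InterpreterOptions();
  if (Platform.isAndroid) {
    // XNNPackDelegate accelerates ops on the CPU (SIMD + multithreading);
    // unlike GpuDelegateV2 it does not depend on OpenCL/OpenGL drivers,
    // which is why it avoids the delegate-init failure above.
    options.addDelegate(XNNPackDelegate());
  }
  return Interpreter.fromAsset('assets/MaskPose.tflite', options: options);
}
```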
Same issue for me.
Hi, I encountered the same problem. I found the following suggestion on https://github.com/tensorflow/tensorflow/issues/60720, which resolved the issue for me.
I added the following line to AndroidManifest.xml:
<uses-library android:name="libOpenCL.so" android:required="false"/>
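For completeness, the `<uses-library>` element must be placed inside the `<application>` tag of the manifest, so in context it looks like this (minimal sketch, other attributes omitted):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <application>
        <!-- Declare the vendor OpenCL driver as an optional native library
             so the TFLite GPU delegate can open it on devices that ship it;
             android:required="false" keeps the app installable elsewhere. -->
        <uses-library
            android:name="libOpenCL.so"
            android:required="false" />
    </application>
</manifest>
```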