[Android] Loading Model from local file with GPUDelegate fails to load
On Android, loading the model from a local file with the GPU delegate passed in fails with the following error (on iOS it is fine):

ERROR Failed to load Tensorflow Model 40! [Error: Failed to create TFLite interpreter from model "http://192.168.50.96:8081/assets/?unstable_path=.%2Fsrc%2Fassets%2Ftflite%2Fmodel.tflite&platform=android&hash=c0788f43f4178cf3118122a13ce43cdf"!]
Here is my code:
```ts
const modelAsset = require(`../assets/tflite/${modelFileName}`);
const GPUDelegate: TensorflowModelDelegate =
  Platform.OS === 'ios' ? 'core-ml' : 'android-gpu';
// I also tried awaiting useTensorflowModel(), with the same result
const modelData = useTensorflowModel(modelAsset, GPUDelegate);
const { model, state } = modelData;
return {
  modelData: model,
  dict,
  isLoading: state === 'loading',
  error: state === 'error',
};
```
If I remove the 2nd argument of the model loader, it works fine.
I am on Android 13, which I thought was supported per the docs?
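Since omitting the delegate argument makes the model load, one workaround is to factor the platform check into a small helper that can also disable the GPU path entirely. This is only a sketch: `pickDelegate` and the `gpuSupported` flag are hypothetical names, not part of react-native-fast-tflite.

```typescript
// Hypothetical helper: pick a TFLite delegate per platform, or return
// undefined so the library falls back to its default (CPU) interpreter.
type Delegate = 'core-ml' | 'android-gpu'; // subset of TensorflowModelDelegate

function pickDelegate(os: string, gpuSupported: boolean): Delegate | undefined {
  if (!gpuSupported) {
    // Passing no delegate is the case that loads successfully above.
    return undefined;
  }
  return os === 'ios' ? 'core-ml' : 'android-gpu';
}
```

Usage would then be `useTensorflowModel(modelAsset, pickDelegate(Platform.OS, gpuSupported))`, with `gpuSupported` driven by a feature flag until the GPU-delegate failure is resolved.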
Logs from adb logcat:
```
2024-08-20 12:30:25.695 12176-12590 tflite com.identafly.production I Created TensorFlow Lite delegate for GPU.
2024-08-20 12:30:25.698 12176-12590 tflite com.identafly.production I Initialized TensorFlow Lite runtime.
2024-08-20 12:30:25.724 12176-12590 tflite com.identafly.production E Following operations are not supported by GPU delegate:
CUSTOM TFLite_Detection_PostProcess: TFLite_Detection_PostProcess
PACK: OP is supported, but tensor type/shape isn't compatible.
RESHAPE: OP is supported, but tensor type/shape isn't compatible.
132 operations will run on the GPU, and the remaining 170 operations will run on the CPU.
2024-08-20 12:30:25.725 12176-12590 tflite com.identafly.production I Replacing 132 node(s) with delegate (TfLiteGpuDelegateV2) node, yielding 2 partitions for the whole graph.
2024-08-20 12:30:25.786 12176-12590 tflite com.identafly.production E Can not open OpenCL library on this device - undefined symbol: clGetCommandBufferInfoKHR
2024-08-20 12:30:25.787 12176-12590 tflite com.identafly.production E Falling back to OpenGL
2024-08-20 12:30:25.797 12176-12590 tflite com.identafly.production I Initialized OpenGL-based API.
2024-08-20 12:30:25.856 12176-12590 tflite com.identafly.production E TfLiteGpuDelegate Init: No shader implementation for reduce_maximum
2024-08-20 12:30:25.862 12176-12590 tflite com.identafly.production I Created 0 GPU delegate kernels.
2024-08-20 12:30:25.862 12176-12590 tflite com.identafly.production E TfLiteGpuDelegate Prepare: delegate is not initialized
2024-08-20 12:30:25.863 12176-12590 tflite com.identafly.production E Node number 302 (TfLiteGpuDelegateV2) failed to prepare.
2024-08-20 12:30:25.868 12176-12590 tflite com.identafly.production E Restored original execution plan after delegate application failure.
2024-08-20 12:30:26.107 12176-12499 ReactNativeJS com.identafly.production E 'Failed to load Tensorflow Model 40!', [Error: Failed to create TFLite interpreter from model "http://192.168.50.96:8081/assets/?unstable_path=.%2Fsrc%2Fassets%2Ftflite%2Fmodel.tflite&platform=android&hash=c0788f43f4178cf3118122a13ce43cdf"!]
```

(Unrelated `[Purchases]` and `TrafficStats` log lines omitted.)
Guten Tag, Hans here. It looks like your issue is not entirely clear and doesn't provide enough details for mrousavy to help you effectively. You mention an error loading the model with GPUDelegate but didn't share how to reproduce it without that argument, nor did you provide the specific environment details like your React Native version, TensorFlow Lite version, etc.
Please ensure that you follow the issue template, including pasting all relevant logs from Android (adb logcat captures must be provided) and other necessary info. This will help mrousavy a lot! For a quicker resolution, consider sponsoring the project here 🍻.
Note: If you think I made a mistake by closing this issue, please ping @mrousavy to take a look.
Guten Tag, Hans here! 🍻 It looks like you're experiencing a valid issue with loading your model using GPUDelegate on Android. Thanks for providing the detailed logs; they are very helpful.
From the logs, it seems that some operations in your model are not supported by the GPU delegate. You might want to consider adjusting your model, or using the CPU delegate, since some operations may require it.
Also, if you could provide more details about the model itself, or share a minimal reproducible example, that would really help mrousavy investigate further!
By the way, if you'd like to support the maintenance of this project and get faster responses, consider sponsoring mrousavy here.
Cheers!
Note: If you think I made a mistake, please ping @mrousavy to take a look.
I have added logs to initial post.
From the log, it is likely that the device cannot load OpenCL correctly.
Did you try including the following in AndroidManifest.xml, as mentioned in the docs?

```xml
<uses-native-library android:name="libOpenCL.so" android:required="false" />
<!-- depending on the device, you may need this too -->
<uses-native-library android:name="libOpenCL-pixel.so" android:required="false" />
```
> Did you try including the following in AndroidManifest.xml, as mentioned in the docs?
I am using Expo so I don't have a direct AndroidManifest. I need to look at how to achieve this.
Oh ya, sadly this can currently only be achieved via prebuild, but I think it could be supported by an Expo config plugin as a new feature PR.
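For context, a local config plugin, once written, is registered via the `plugins` array in the project's app.json. This is just a sketch; the `./plugins/withUsesNativeLibrary` path is a hypothetical example, not an existing file:

```json
{
  "expo": {
    "plugins": ["./plugins/withUsesNativeLibrary"]
  }
}
```

After that, `npx expo prebuild` regenerates the native project with the plugin's manifest modifications applied.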
> Oh ya, sadly this can currently only be achieved via prebuild, but I think it could be supported by an Expo config plugin as a new feature PR.
@TkTioNG This is what I have come up with, but something doesn't seem to be working:
```js
const { withAndroidManifest } = require('expo/config-plugins');

// Define the library names you want to add to the AndroidManifest.xml
const nativeLibraries = [
  'libOpenCL.so',
  'libOpenCL-pixel.so',
  'libGLES_mali.so',
  'libPVROCL.so',
];

module.exports = function withUsesNativeLibrary(config) {
  return withAndroidManifest(config, config => {
    const { modResults } = config;
    if (
      !Array.isArray(modResults.manifest.application?.['uses-native-library'])
    ) {
      modResults.manifest.application['uses-native-library'] = [];
    }
    // Add each native library entry, skipping ones that already exist
    nativeLibraries.forEach(libraryName => {
      const hasLib = modResults.manifest.application[
        'uses-native-library'
      ].some(lib => {
        return lib['$']['android:name'] === libraryName;
      });
      if (!hasLib) {
        modResults.manifest.application['uses-native-library'].push({
          $: {
            'android:name': libraryName,
            'android:required': 'false',
          },
        });
      }
    });
    // console.log(modResults.manifest.application['uses-native-library']); ==> seems OK here?
    return config;
  });
};
```
If I log `console.log(modResults.manifest.application['uses-native-library'])` just before the return, the result looks OK, I think:
```js
[
  {
    '$': { 'android:name': 'libOpenCL.so', 'android:required': 'false' }
  },
  {
    '$': {
      'android:name': 'libOpenCL-pixel.so',
      'android:required': 'false'
    }
  },
  {
    '$': { 'android:name': 'libGLES_mali.so', 'android:required': 'false' }
  },
  {
    '$': { 'android:name': 'libPVROCL.so', 'android:required': 'false' }
  }
]
```
So it seems to be there, but when I log the full config while the app is running, it's not there. Is it being stripped out?
Am I handling adding it to the config correctly? Does `uses-native-library` not work with Expo?
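One possible culprit, offered as an assumption rather than a confirmed diagnosis: Expo's manifest types model `<application>` as an array of elements, so the entries may need to be pushed onto `modResults.manifest.application[0]` rather than onto the array itself; a named property set directly on an array logs fine but is typically dropped when the manifest is serialized back to XML. A sketch of that variant, with the merge logic pulled into a standalone helper (`addNativeLibraries` is a hypothetical name):

```typescript
// Hypothetical helper: merge <uses-native-library> entries into a single
// <application> element object, skipping names that are already present.
type ManifestEntry = { $: { [attr: string]: string } };
type ApplicationElement = { [tag: string]: any };

function addNativeLibraries(
  application: ApplicationElement,
  names: string[],
): ApplicationElement {
  const entries: ManifestEntry[] = (application['uses-native-library'] ??= []);
  for (const name of names) {
    const exists = entries.some((e) => e.$['android:name'] === name);
    if (!exists) {
      entries.push({ $: { 'android:name': name, 'android:required': 'false' } });
    }
  }
  return application;
}

// Inside the config plugin, target the first <application> element:
// const app = config.modResults.manifest.application?.[0];
// if (app) addNativeLibraries(app, nativeLibraries);
```

If the serialized AndroidManifest.xml then contains the `<uses-native-library>` tags after `npx expo prebuild`, the array-vs-element access was the issue.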
I am seeing this issue on iOS now too:

```ts
const GPUDelegate: TensorflowModelDelegate | undefined =
  Platform.OS === 'ios' ? 'core-ml' : undefined; // 'android-gpu';
const modelData = useTensorflowModel(
  require('../assets/tflite/model.tflite'),
  GPUDelegate
);
```
If I remove the delegate, it loads fine. Working on getting a reproducible example.
```json
"react-native-worklets-core": "^1.5.0",
"react-native-fast-tflite": "^1.5.0",
```
I have updated the title of this issue, and added iOS logs to the original description, as I am now able to replicate the issue on iOS too. Here is a simplified repo that shows how to see the error: https://github.com/lucksp/test-tflite-model
@mrousavy I don't know why, but this bot closed the issue just now, after I provided more details and a repo?
yea sorry the bot isn't very smart
@lucksp Hey, I just opened a quick PR for Expo-managed apps: https://github.com/mrousavy/react-native-fast-tflite/pull/109. I'm not sure whether that API shape is the right one, though.
Also, I'm not sure about the iOS issue.