mediapipe
Support for armeabi-v7a (32-bit ARM) in the LLM Inference API
I was testing an app that uses the LLM Inference API on a 32-bit ARM Android device (its only supported ABI is armeabi-v7a), and the app crashed immediately on startup with:
```
java.lang.UnsatisfiedLinkError: dlopen failed: library "libllm_inference_engine_jni.so" not found
```
On analyzing the APK, I found no native libraries for armeabi-v7a in the lib directory; only arm64-v8a libraries were present.
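For reference, the missing-ABI check can be reproduced off-device, since an APK is just a zip archive whose native libraries live under `lib/<abi>/`. This is a minimal sketch, not MediaPipe code: it builds a stand-in APK containing only a 64-bit library entry (mimicking what the released APK showed) and lists the bundled ABIs.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ApkAbiList {
    public static void main(String[] args) throws IOException {
        // Build a tiny stand-in "APK" (a plain zip) containing only an
        // arm64-v8a native library, as observed in the real APK.
        File apk = File.createTempFile("demo", ".apk");
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(apk))) {
            zos.putNextEntry(new ZipEntry("lib/arm64-v8a/libllm_inference_engine_jni.so"));
            zos.closeEntry();
        }
        // List every lib/<abi>/ entry; armeabi-v7a never appears.
        try (ZipFile zf = new ZipFile(apk)) {
            zf.stream()
              .map(ZipEntry::getName)
              .filter(n -> n.startsWith("lib/"))
              .forEach(System.out::println);
        }
        apk.delete();
    }
}
```

Running this prints only the arm64-v8a entry, which is exactly the situation that makes a 32-bit device unable to resolve the library at dlopen time.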
- Support for 32-bit ARM devices would be great, as it would enable lower-end (or older) Android devices to run on-device LLMs.
- Other MediaPipe APIs (such as the Face Detection API) work fine on 32-bit devices. Is there anything different about the LLM Inference API? Does it require ARM 64-bit-specific instructions, or a dependency that only ships as arm64-v8a?
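As a workaround until 32-bit support exists, the crash can at least be turned into a graceful fallback by catching `UnsatisfiedLinkError` around the library load. This is a hedged sketch of the guard pattern, not MediaPipe's actual loading code; the library name matches the one in the error above.

```java
public class AbiGuard {
    public static void main(String[] args) {
        // Attempt to load the JNI library. On an armeabi-v7a-only device
        // (or any machine without the .so on its library path) this throws
        // UnsatisfiedLinkError, which we catch instead of crashing the app.
        try {
            System.loadLibrary("llm_inference_engine_jni");
            System.out.println("native library loaded");
        } catch (UnsatisfiedLinkError e) {
            // Degrade gracefully, e.g. hide the LLM feature in the UI.
            System.out.println("LLM inference unavailable: " + e.getMessage());
        }
    }
}
```

In a real app this check would run once at feature initialization, so devices without a matching ABI simply see the feature disabled rather than an instant crash.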