
[Bug] Can't use App caused by No implementation found for int org.apache.tvm.LibInfo.nativeLibInit(java.lang.String)

jordanqi opened this issue 6 months ago

🐛 Bug

To Reproduce

Steps to reproduce the behavior:

Expected behavior

I'm trying to deploy a model on Android using the latest MLC-LLM build, but I hit a runtime crash caused by a missing JNI binding. Here is the error message from Logcat:

E  No implementation found for int org.apache.tvm.LibInfo.nativeLibInit(java.lang.String) (tried Java_org_apache_tvm_LibInfo_nativeLibInit and Java_org_apache_tvm_LibInfo_nativeLibInit__Ljava_lang_String_2)

FATAL EXCEPTION: main
Process: ai.mlc.mlcchat, PID: 8629
java.lang.RuntimeException: Cannot create an instance of class ai.mlc.mlcchat.AppViewModel

It appears that the JNI function LibInfo.nativeLibInit() is missing from the libtvm4j_runtime_packed.so built for Android (arm64-v8a). I verified that this .so is loaded, but the symbol cannot be found, and all of the required files were already generated.
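For reference, this is roughly the JNI entry point the runtime tries to resolve when that native method is called. This is only a sketch on my side: the mangled symbol name comes from the error message above, while the parameter names and body are assumptions for illustration.

```c
/* Sketch of the JNI entry point that the Android runtime looks up for
 * org.apache.tvm.LibInfo.nativeLibInit(String). Only the mangled name
 * Java_org_apache_tvm_LibInfo_nativeLibInit is taken from the Logcat error;
 * everything else here is illustrative. */
#include <jni.h>

JNIEXPORT jint JNICALL
Java_org_apache_tvm_LibInfo_nativeLibInit(JNIEnv *env,
                                          jobject self,        /* jclass instead if the method is static */
                                          jstring tvm_lib_file) {
    /* The real implementation should come from the tvm4j native glue that is
     * supposed to be compiled into libtvm4j_runtime_packed.so. */
    (void)env; (void)self; (void)tvm_lib_file;
    return 0;
}
```

If a tool such as llvm-nm from the NDK does not list this symbol among the defined exports of libtvm4j_runtime_packed.so, then the tvm4j JNI glue was apparently not linked into the packed library, which would match the crash above.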

I built the project using:

mlc_llm package with --device android


Environment

  • Platform (e.g. WebGPU/Vulkan/IOS/Android/CUDA): Android
  • Operating system (e.g. Ubuntu/Windows/MacOS/...): Windows
  • Device (e.g. iPhone 12 Pro, PC+RTX 3090, ...)
  • How you installed MLC-LLM (conda, source): conda
  • How you installed TVM-Unity (pip, source): pip
  • Python version (e.g. 3.10): 3.11
  • GPU driver version (if applicable):
  • CUDA/cuDNN version (if applicable):
  • TVM Unity Hash Tag (python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))", applicable if you compile models):
  • Any other relevant information:

Additional context

jordanqi · Apr 13 '25 02:04