Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
Common causes for lock verification issues are non-optimized dex code
and incorrect proguard optimizations.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method boolean androidx.compose.runtime.snapshots.SnapshotStateList.conditionalUpdate$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method java.lang.Object androidx.compose.runtime.snapshots.SnapshotStateList.mutate(kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update(boolean, kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A F0000 00:00:1712808445.094404 8076 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A terminating.
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A F0000 00:00:1712808445.094404 8076 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A terminating.
2024-04-11 09:37:25.095 8002-8076 libc com...diapipe.examples.llminference A Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 8076 (DefaultDispatch), pid 8002 (es.llminference)
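The fatal `F0000` log above means the native engine received a model file it could not open at the configured path. As a quick pre-flight sanity check before initializing the engine, one could verify the file actually exists and is non-empty. This is a minimal sketch with a hypothetical helper name, not part of the MediaPipe sample:

```kotlin
import java.io.File

// Hypothetical pre-flight check: "LLM model file is null" from
// llm_inference_engine.cc typically means the native layer could not
// open the file at the path handed to it.
fun isModelFileUsable(path: String): Boolean {
    val f = File(path)
    // The path must point at a regular file that contains data.
    return f.isFile && f.length() > 0L
}
```

Running this check (and logging the result) before constructing the inference engine makes it easier to tell a bad path from a bad model.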
It says my LLM model is null, even though I converted it exactly as described.
I'm encountering the same issue.
Hi @Vasanthengineer4949,
Could you please attempt the suggestions provided by @yuimo? If the issue persists, kindly furnish the following details for a better understanding and potential issue replication:
- Detailed steps you are following, referring to the documentation.
- Operating System (OS) specifics including version.
- Android Studio version in use.
Thank you!!
We ran into a similar error while trying to run the MediaPipe LLM example from Google.
Here is the error message:
29335 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
- MediaPipe: v0.10.11
- LLM model: _gemma-2b_, downloaded from Kaggle (https://www.kaggle.com/models/google/gemma/frameworks/tfLite), no conversion
- Physical device: Pixel 6, Android 14
- Android Studio: 2023.2.1 Patch 1
@yuimo could you let us know which model you downloaded and replaced?
Which file are you downloading? https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4 should work on Android (among others).
Hi @Vasanthengineer4949,
Could you kindly review the previous comment and provide us with the required information?
Thank you!!
This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.
This issue was closed due to lack of activity after being marked stale for the past 7 days.
@yuimo could you let us know which model you downloaded and replaced?
We use the official model provided by Google: https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4
Our observation: re-downloading and replacing the model makes it work, but after a few days (or even a few hours) it stops working again. Sometimes even replacing it with the original model doesn't help. A strange phenomenon; looking forward to your reply.
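One way to tell whether the on-device copy has been corrupted or truncated (which would explain it working right after a re-download and then failing later) is to compare a checksum of the file on disk against the checksum of the fresh download. A minimal sketch; the helper name is illustrative, not part of the sample:

```kotlin
import java.io.File
import java.security.MessageDigest

// Hypothetical helper: hex-encoded SHA-256 of a file, so the on-disk
// model can be compared against the checksum of the original download.
fun sha256Of(file: File): String {
    val digest = MessageDigest.getInstance("SHA-256")
    file.inputStream().use { input ->
        val buf = ByteArray(64 * 1024)
        while (true) {
            val n = input.read(buf)
            if (n < 0) break
            digest.update(buf, 0, n)
        }
    }
    return digest.digest().joinToString("") { "%02x".format(it) }
}
```

If the hash changes between a working run and a failing one, the file itself is being altered or truncated on disk rather than the engine misbehaving.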
Same problem here, using the Kaggle TFLite binaries. Could it be an Android filesystem restriction?
I figured it out: I select the model with rememberLauncherForActivityResult(contract = ActivityResultContracts.OpenDocument()) and then copy it with a FileOutputStream into my application's data directory, given by context.applicationContext.filesDir.
On my Android 13 device, the model only loads without this error from that location. Hope it helps.
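The copy step described above can be sketched roughly as follows. The InputStream would come from contentResolver.openInputStream(uri) for the Uri returned by the OpenDocument launcher, and filesDir from context.applicationContext.filesDir; the function name and default file name are illustrative:

```kotlin
import java.io.File
import java.io.InputStream

// Illustrative sketch of the copy step: stream the document the user
// picked into the app's private files directory, and return the
// resulting File whose absolute path can then be handed to the engine.
fun copyModelToAppStorage(
    source: InputStream,
    filesDir: File,
    fileName: String = "model.bin"
): File {
    val dest = File(filesDir, fileName)
    source.use { input ->
        dest.outputStream().use { output ->
            input.copyTo(output)
        }
    }
    return dest
}
```

Copying matters because the native engine opens the model by plain file path, which it cannot do through a content:// Uri; a file inside filesDir is always readable by the app's own process.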