
Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null

Open Vasanthengineer4949 opened this issue 10 months ago • 7 comments

```
Common causes for lock verification issues are non-optimized dex code and incorrect proguard optimizations.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method boolean androidx.compose.runtime.snapshots.SnapshotStateList.conditionalUpdate$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method java.lang.Object androidx.compose.runtime.snapshots.SnapshotStateList.mutate(kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update(boolean, kotlin.jvm.functions.Function1) failed lock verification and will run slower.
2024-04-11 09:37:24.910 8002-8002 es.llminference com...diapipe.examples.llminference W Method void androidx.compose.runtime.snapshots.SnapshotStateList.update$default(androidx.compose.runtime.snapshots.SnapshotStateList, boolean, kotlin.jvm.functions.Function1, int, java.lang.Object) failed lock verification and will run slower.
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A F0000 00:00:1712808445.094404 8076 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A terminating.
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A F0000 00:00:1712808445.094404 8076 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
2024-04-11 09:37:25.094 8002-8076 native com...diapipe.examples.llminference A terminating.
2024-04-11 09:37:25.095 8002-8076 libc com...diapipe.examples.llminference A Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 8076 (DefaultDispatch), pid 8002 (es.llminference)
```

It says my LLM model is null, even though I converted the model exactly as described.

Vasanthengineer4949 avatar Apr 11 '24 04:04 Vasanthengineer4949

I've encountered the same issue.

volcano1216 avatar Apr 11 '24 09:04 volcano1216

Hi @Vasanthengineer4949,

Could you please attempt the suggestions provided by @yuimo? If the issue persists, kindly furnish the following details for a better understanding and potential issue replication:

  1. Detailed steps you are following, referring to the documentation.
  2. Operating System (OS) specifics including version.
  3. Android Studio version in use.

Thank you!!

kuaashish avatar Apr 12 '24 11:04 kuaashish

We're stuck on a similar error. I was trying to run the MediaPipe LLM example from Google.

Here is the error message: 29335 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null

  • Mediapipe v0.10.11
  • LLM model: _gemma-2b_ LLM model downloaded from Kaggle https://www.kaggle.com/models/google/gemma/frameworks/tfLite (no conversion)
  • Physical device Pixel 6 Android 14
  • Android Studio 2023.2.1 Patch 1

@yuimo could you let us know which model you downloaded and replaced?

jeffxchu avatar Apr 16 '24 21:04 jeffxchu

Which file are you downloading? https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4 should work on Android (among others).
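For context, the downloaded .bin file is handed to the task by path when the engine is created; if that path is missing or unreadable, the native layer aborts with exactly the "LLM model file is null" error shown above. A minimal setup sketch, following the MediaPipe LLM Inference sample (the path and max-token value are illustrative; check the option names against your MediaPipe version):

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Sketch: initialize the LLM Inference task from a model file already on
// the device. If the file at `modelPath` does not exist, the native engine
// fails with "Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null".
fun createLlmEngine(context: Context, modelPath: String): LlmInference {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(modelPath) // e.g. "/data/local/tmp/llm/model.bin" in the sample
        .setMaxTokens(512)
        .build()
    return LlmInference.createFromOptions(context, options)
}
```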

schmidt-sebastian avatar Apr 19 '24 17:04 schmidt-sebastian

Hi @Vasanthengineer4949,

Could you kindly review the previous comment and provide us with the required information?

Thank you!!

kuaashish avatar Apr 22 '24 07:04 kuaashish

This issue has been marked stale because it has had no activity for the past 7 days. It will be closed if no further activity occurs. Thank you.

github-actions[bot] avatar Apr 30 '24 01:04 github-actions[bot]

This issue was closed due to lack of activity after being marked stale for the past 7 days.

github-actions[bot] avatar May 07 '24 01:05 github-actions[bot]


> We're stuck on a similar error. I was trying to run the MediaPipe LLM example from Google.
>
> Here is the error message: 29335 llm_inference_engine.cc:92] Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
>
> • MediaPipe v0.10.11
> • LLM model: _gemma-2b_ LLM model downloaded from Kaggle https://www.kaggle.com/models/google/gemma/frameworks/tfLite (no conversion)
> • Physical device Pixel 6, Android 14
> • Android Studio 2023.2.1 Patch 1
>
> @yuimo could you let us know which model you downloaded and replaced?

We use the official model provided by Google: https://www.kaggle.com/models/google/gemma/tfLite/gemma-2b-it-gpu-int4

Our observation: if we re-download and replace the model, it works, but after a few hours or days it stops working again. Sometimes it doesn't work even after we replace it with the original model. A strange phenomenon; looking forward to your reply.
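Since the failure is intermittent, it may help to verify that the model file actually exists and is non-empty before constructing the engine, so the app can surface a readable error instead of the native SIGABRT. A defensive sketch (the path and function name are illustrative, not from the original comments):

```kotlin
import java.io.File

// Sketch: fail fast with a clear message instead of letting the native
// engine abort on a missing or empty model file.
fun requireModelFile(path: String): File {
    val file = File(path)
    require(file.exists()) { "Model file not found at $path" }
    require(file.length() > 0L) { "Model file at $path is empty" }
    return file
}
```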

yuimo avatar May 13 '24 09:05 yuimo

Same problem here, using the Kaggle TFLite bins. Could it be an Android filesystem restriction?

Fr1z avatar May 27 '24 19:05 Fr1z

I figured it out: select the model with rememberLauncherForActivityResult(contract = ActivityResultContracts.OpenDocument()), then copy it with a FileOutputStream into the application data directory given by context.applicationContext.filesDir.

On my Android 13 device, the model loads without this error only from that location. Hope it helps.
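The workaround above can be sketched roughly as follows (a minimal sketch; the composable name, target file name, and `onModelReady` callback are illustrative placeholders, not part of the original comment):

```kotlin
import android.net.Uri
import androidx.activity.compose.rememberLauncherForActivityResult
import androidx.activity.result.contract.ActivityResultContracts
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.platform.LocalContext
import java.io.File
import java.io.FileOutputStream

// Sketch: let the user pick the model via the Storage Access Framework,
// then copy it into app-private filesDir, where the engine can read it.
@Composable
fun ModelPicker(onModelReady: (File) -> Unit) {
    val context = LocalContext.current
    val launcher = rememberLauncherForActivityResult(
        contract = ActivityResultContracts.OpenDocument()
    ) { uri: Uri? ->
        uri ?: return@rememberLauncherForActivityResult
        val target = File(context.applicationContext.filesDir, "model.bin")
        // Stream the picked document into app-private storage.
        context.contentResolver.openInputStream(uri)?.use { input ->
            FileOutputStream(target).use { output ->
                input.copyTo(output)
            }
        }
        onModelReady(target)
    }
    Button(onClick = { launcher.launch(arrayOf("*/*")) }) {
        Text("Pick LLM model")
    }
}
```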

Fr1z avatar May 29 '24 09:05 Fr1z