mediapipe
universal-sentence-encoder-multilingual cannot run in TextEmbedder
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)
None
OS Platform and Distribution
Android 13
MediaPipe Tasks SDK version
com.google.mediapipe:tasks-text:0.20230731
Task name (e.g. Image classification, Gesture recognition etc.)
Text Embedder
Programming Language and version (e.g. C++, Python, Java)
Java
Describe the actual behavior
Error
Describe the expected behaviour
No error
Standalone code/steps you may have used to try to get what you need
I converted the universal-sentence-encoder-multilingual model (https://tfhub.dev/google/universal-sentence-encoder-multilingual/3) to TFLite and ran it with the TextEmbedder API, but the following error appears:
Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
E Node number 0 (FlexSentencepieceOp) failed to prepare.
E E20231103 15:11:31.223073 9368 calculator_graph.cc:853] INTERNAL: CalculatorGraph::Run() failed in Run: Calculator::Open() for node "mediapipe_tasks_text_text_embedder_textembeddergraph__mediapipe_tasks_core_inferencesubgraph__inferencecalculator__mediapipe_tasks_text_text_embedder_textembeddergraph__mediapipe_tasks_core_inferencesubgraph__InferenceCalculator" failed: ; RET_CHECK failure (mediapipe/calculators/tensor/inference_interpreter_delegate_runner.cc:227) (interpreter->AllocateTensors())==(kTfLiteOk)
E Text embedder failed to load model with error: internal: CalculatorGraph::Run() failed in Run: Calculator::Open() for node "mediapipe_tasks_text_text_embedder_textembeddergraph__mediapipe_tasks_core_inferencesubgraph__inferencecalculator__mediapipe_tasks_text_text_embedder_textembeddergraph__mediapipe_tasks_core_inferencesubgraph__InferenceCalculator" failed: ; RET_CHECK failure (mediapipe/calculators/tensor/inference_interpreter_delegate_runner.cc:227) (interpreter->AllocateTensors())==(kTfLiteOk)
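For completeness, the conversion followed the standard Select TF ops recipe, roughly as below. This is a minimal sketch: the local SavedModel path and output file name are placeholders, and the exact converter flags used may have differed.

```python
import tensorflow as tf

# Load the SavedModel downloaded from TF Hub (path is a placeholder).
converter = tf.lite.TFLiteConverter.from_saved_model(
    "universal-sentence-encoder-multilingual_3"
)

# The model contains ops (e.g. SentencepieceOp) that have no built-in
# TFLite kernel, so Select TF ops must be enabled at conversion time.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow kernels
]

tflite_model = converter.convert()
with open("use_multilingual.tflite", "wb") as f:
    f.write(tflite_model)
```

The conversion itself succeeds; the failure only occurs at load time inside TextEmbedder.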
Does the TextEmbedder API support the universal-sentence-encoder-multilingual model?
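I also tried the fix the error message suggests, adding the Select TF ops runtime to the app's Gradle dependencies along these lines (the version number here is illustrative, not the exact one I used):

```groovy
dependencies {
    implementation 'com.google.mediapipe:tasks-text:0.20230731'
    // Suggested by the error message; it is unclear whether the
    // interpreter bundled with MediaPipe Tasks actually links this.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.13.0'
}
```

The same AllocateTensors() failure persists, which suggests the MediaPipe Tasks interpreter does not pick up the Flex delegate from this dependency.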
Other info / Complete Logs
No response
Same approach and same error here. Please support multilingual models.