react-native-llm-mediapipe
Which model should be added?
Hi, I have added a .task model (Gemma3-1B-IT_multi-prefill-seq_q8_ekv1280.task) to the assets folder for Android. But when I call the useLlmInference hook like this, the app crashes with a SIGABRT error. Could anyone help me figure out how to add a model so that I can run it successfully?
const { generateResponse } = useLlmInference({ modelPath: "Gemma3-1B-IT_multi-prefill-seq_q8_ekv1280.task" });
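For context, here is a minimal sketch of how the hook is typically wired into a component. The option names (modelPath vs. modelName, and whether a storage-type option is needed for asset-bundled models) are assumptions that may differ between versions of react-native-llm-mediapipe, so check the package's README against the version you have installed:

```typescript
import React, { useState } from 'react';
import { Button, Text, View } from 'react-native';
import { useLlmInference } from 'react-native-llm-mediapipe';

export default function LlmDemo() {
  const [answer, setAnswer] = useState('');

  // Assumption: the installed version accepts `modelPath` pointing at a
  // bundled asset; some versions may instead expect e.g. `modelName`
  // together with an asset/storage option.
  const { generateResponse } = useLlmInference({
    modelPath: 'Gemma3-1B-IT_multi-prefill-seq_q8_ekv1280.task',
  });

  const ask = async () => {
    // Assumption: generateResponse takes a prompt string and resolves
    // with the generated text.
    const text = await generateResponse('Write a haiku about Android.');
    setAnswer(text);
  };

  return (
    <View>
      <Button title="Generate" onPress={ask} />
      <Text>{answer}</Text>
    </View>
  );
}
```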
Hey there, have you found any solution for this?