react-native-llm-mediapipe
Run an LLM on iOS & Android devices using React Native
Hi, I'm getting this error while using the package. I might be using it wrong; can you help me with ``` const llmInference = useLlmInference({ storageType: 'file', modelPath: './gemma-2b-it-gpu-int4.bin', });...
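A likely cause here is the relative `modelPath`. Judging by the other reports on this page (which pass an absolute path like `/data/user/0/.../files/...`), `storageType: 'file'` appears to expect an absolute on-device path, not a path relative to the JS bundle. This is an assumption about the package's behavior, and the helper below is hypothetical (not part of react-native-llm-mediapipe), just a sketch of the distinction:

```typescript
// Hypothetical check illustrating why './gemma-2b-it-gpu-int4.bin' fails
// if storageType 'file' requires an absolute on-device path: a relative
// path has no meaningful resolution inside the native file system sandbox.
function looksLikeAbsoluteModelPath(modelPath: string): boolean {
  // Absolute POSIX path (iOS/Android sandbox) or an explicit file:// URI.
  return modelPath.startsWith('/') || modelPath.startsWith('file://');
}

console.log(looksLikeAbsoluteModelPath('./gemma-2b-it-gpu-int4.bin'));
// -> false: relative to nothing the native side knows about
console.log(
  looksLikeAbsoluteModelPath(
    '/data/user/0/com.example.app/files/gemma-2b-it-cpu-int4.bin',
  ),
);
// -> true: an app-files-directory path like the one in the crash report below
```

In practice the absolute path would come from something like a file-system library's documents-directory constant after downloading the model, rather than being hard-coded.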
I managed, with some effort, to spin up the project from the example app, but after I install llm-mediapipe via npm in a freshly generated project, I get an error after running...
I was quite impressed to find that this exists; thank you for this effort. I was wondering if you @cdiddy77 have an idea about whether a particular model,...
The app crashes when using modelPath after downloading the file from the network. ``` const llmInference = useLlmInference({ storageType: 'file', modelPath: '/data/user/0/com.offlinellmpoc/files/gemma-2b-it-cpu-int4.bin', }); ``` or ``` const llmInference = useLlmInference({ storageType:...
Following your model setup guide fails when trying to install requirements.txt, giving the error in the title. I tried it twice and it failed both times. (I reinstalled Python to...
Hi, I have added a .task model (Gemma3-1B-IT_multi-prefill-seq_q8_ekv1280.task) to the assets folder for Android. But when I run a useLlmInference function like this, it crashes with a SIGABRT error. Could anyone help...