
Using a local model file fails with "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin"

Open akau16 opened this issue 1 year ago • 2 comments

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

None

OS Platform and Distribution

Firebase Hosting

MediaPipe Tasks SDK version

No response

Task name (e.g. Image classification, Gesture recognition etc.)

/llm_inference /js/

Programming Language and version (e.g. C++, Python, Java)

html, javascript

Describe the actual behavior

Cannot access the local model file.

Describe the expected behaviour

Can access the model file.

Standalone code/steps you may have used to try to get what you need

When I run llm_inference on localhost, it can access a model file such as "gemma-2b-it-gpu-int4.bin" in the project folder. But when I run llm_inference on Firebase Hosting, it cannot access the on-device model file and shows "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin".
When I looked into this, I found: 'In standard HTML and JavaScript, it is not possible to directly specify to read files with a specific path on the local machine. This is due to browser security restrictions designed to protect user privacy and prevent malicious websites from automatically accessing the local file system.'

However, when I tried your sample in MediaPipe Studio (https://mediapipe-studio.webapps.google.com/studio/demo/llm_inference), I could click 'Choose a model file', select a model file on my device, and it ran fine. How does it do that? Thank you!
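For illustration, the browser restriction described above can be demonstrated with a small check. This is a hypothetical helper, not part of MediaPipe: it only classifies which kinds of model paths a page served over http(s) is allowed to fetch.

```javascript
// Hypothetical helper: can a web page fetch a model from this path?
// Pages can fetch http(s), relative, and blob: URLs, but not file:///
// URLs or bare Windows drive paths like "D:/model/...".
function isFetchableModelPath(path) {
  if (/^file:\/\//i.test(path)) return false;   // file:/// URL
  if (/^[a-z]:\//i.test(path)) return false;    // Windows drive path
  return true;
}

console.log(isFetchableModelPath('gemma-2b-it-gpu-int4.bin'));          // true (relative)
console.log(isFetchableModelPath('https://example.com/model.bin'));     // true
console.log(isFetchableModelPath('D:/model/gemma-2b-it-gpu-int4.bin')); // false
console.log(isFetchableModelPath('file:///D:/model/gemma.bin'));        // false
```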

Other info / Complete Logs

No response

akau16 avatar Aug 30 '24 08:08 akau16

Hi @akau16,

Could you please review the Stack Overflow thread https://stackoverflow.com/questions/5074680/chrome-safari-errornot-allowed-to-load-local-resource-file-d-css-style and try the suggested solution? Let us know if you still need further assistance.

Thank you!!

kuaashish avatar Aug 30 '24 09:08 kuaashish

Hi @kuaashish,

Thanks for your kind reply. I think my problem is a little different from that one. Below is my code:

LlmInference.createFromOptions(genaiFileset, { baseOptions: { modelAssetPath: 'D:/model/gemma-2b-it-gpu-int4.bin' }, .....

and it shows the error "Not allowed to load local resource: file:///D:/model/gemma-2b-it-gpu-int4.bin". How can I resolve this problem? Thank you.

akau16 avatar Aug 31 '24 16:08 akau16

For safety, browsers do not allow webpages to freely access the user's file system, so URLs using file:/// or D:/ will cause an error in most situations. That just means you need to get the model from a different place, or in a "safer" way. One example of each:

  • You can use a file chooser (as in MediaPipe Studio), since browsers will allow the user to select a model from their own computer (they just won't allow the webpage to do this on the user's behalf)
  • You can host the model alongside your demo (or at some other URL), which removes the file:/// or D:/ from the path
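The first option can be sketched roughly as below. This is an illustrative sketch, not the exact MediaPipe Studio code: `loadModelFromFile` is a made-up helper, the `FilesetResolver`/`LlmInference` parameters stand in for the classes exported by the @mediapipe/tasks-genai package, and option names may vary between SDK versions.

```javascript
// Sketch of the file-chooser approach: the user picks the model via an
// <input type="file">, and the page converts the resulting File into a
// blob: URL, which the browser IS allowed to fetch (unlike file:/// or
// D:/ paths). FilesetResolver and LlmInference are passed in for
// illustration; in a real page they come from @mediapipe/tasks-genai.
async function loadModelFromFile(file, FilesetResolver, LlmInference) {
  // blob: URL backed by the user-selected File object.
  const modelUrl = URL.createObjectURL(file);
  const genaiFileset = await FilesetResolver.forGenAiTasks(
    'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai/wasm');
  return LlmInference.createFromOptions(genaiFileset, {
    baseOptions: { modelAssetPath: modelUrl },
  });
}

// Possible wiring in the page:
//   <input type="file" id="model-file">
//   document.querySelector('#model-file').addEventListener('change', async (e) => {
//     const llm = await loadModelFromFile(e.target.files[0], FilesetResolver, LlmInference);
//     console.log(await llm.generateResponse('Hello!'));
//   });
```

The key point is that the user's explicit file selection is what grants the page access; the page itself never names a local path.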

tyrmullen avatar Oct 28 '25 20:10 tyrmullen