
Does Copilot support models that are locally hosted, or proprietary models for which we have an API & API key?

Open pengaustin opened this issue 9 months ago • 2 comments

Is your feature request related to a problem? Please describe. Given an API & API key, how would I set up Copilot to use my own locally hosted or proprietary model?

Describe the solution you'd like If this feature exists, a step-by-step setup procedure!

Describe alternatives you've considered N/A

Additional context N/A

pengaustin avatar Mar 20 '25 17:03 pengaustin

In Copilot Settings, just add the correct custom model for chat and for embedding. For each one, set the model name and provider separately; an API key might not be needed. Then, on the "Basic" tab of Copilot Settings, select the proper default models for Chat and Embedding. Make sure you can refresh the index, then try to chat.

I just did it myself with the Ollama provider, using Llama3.2 as the chat model and bge-m3 (Ollama) for embeddings.
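Before pointing Copilot at a local server, it can help to confirm the server answers on its own. The following is a sketch, assuming a stock Ollama install on its default port 11434 and the model names mentioned above (adjust both to your setup):

```shell
# List the models the server can serve -- the names here must match
# what you enter in Copilot's custom-model settings:
curl http://localhost:11434/api/tags

# Exercise the chat model directly:
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{"role": "user", "content": "hello"}],
  "stream": false
}'

# Exercise the embedding model:
curl http://localhost:11434/api/embed -d '{
  "model": "bge-m3",
  "input": "hello"
}'
```

If these requests fail, the problem is with the server or model names rather than with Copilot's settings.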

beeduino avatar Mar 23 '25 05:03 beeduino

Does not work for an MSTY server running on http://localhost:10000, with or without CORS, and with or without the /v1/chat path.

I get the following error both without CORS and with CORS:

```
Error: BodyStreamBuffer was aborted
    at e.ping (plugin:copilot:565:32)
    at async q (plugin:copilot:805:69761)
```

With CORS turned off, I get "404: Page Not Found"

MSTY's server is essentially an embedded Ollama server. You can download the Ollama server from Ollama's website and replace the MSTY server binary directly, and it all works. So MSTY's server is simply Ollama running on port 10000 instead of Ollama's default port 11434.
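One thing worth trying is to reproduce MSTY's setup with stock Ollama while explicitly allowing requests from Obsidian's origin, since Obsidian plugins make requests from `app://obsidian.md` and a CORS rejection can surface as exactly this kind of aborted request. A sketch, assuming stock Ollama (the `OLLAMA_HOST` and `OLLAMA_ORIGINS` environment variables are from Ollama's documentation; the port matches MSTY's):

```shell
# Bind Ollama to MSTY's port instead of the default 11434,
# and accept cross-origin requests from Obsidian:
OLLAMA_HOST=127.0.0.1:10000 \
OLLAMA_ORIGINS="app://obsidian.md*" \
ollama serve
```

If this works where the MSTY binary does not, the difference is likely in how MSTY configures its embedded server rather than in Copilot.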

But it won't work with Copilot. It might work with genuine Ollama, I suppose, but that's not useful for me.

Also, I can't find any way to EDIT the model once it has been added! Which is ridiculous...

richardstevenhack avatar Apr 07 '25 23:04 richardstevenhack


It's working with Ollama. If you still have this issue, please reopen.

ichts avatar Oct 27 '25 15:10 ichts