Does Copilot support locally hosted or proprietary models for which we have an API and an API key?
**Is your feature request related to a problem? Please describe.**
Given an API and an API key, how would I set up Copilot to use my own locally hosted or proprietary model?

**Describe the solution you'd like**
If this feature exists, a step-by-step setup procedure!

**Describe alternatives you've considered**
N/A

**Additional context**
N/A
In Copilot Settings, just add the correct custom model for chat and for embedding, setting the model name and provider for each separately (an API key might not be needed). Then:

1. On the "Basic" tab of Copilot Settings, select the proper default models for Chat and Embedding.
2. Make sure you can refresh the index.
3. Try to chat.
I just did it myself with the Ollama provider: Llama3.2 as the chat model and bge-m3 (Ollama) for embeddings.
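Before touching Copilot Settings, it can help to confirm the server side is healthy on its own. Here's a minimal Python sketch against Ollama's standard API, assuming the default port 11434 and the model names above (llama3.2 for chat, bge-m3 for embeddings); adjust to your setup. If any of these calls fail, the problem is the server, not Copilot.

```python
# Sanity-check a local Ollama server before pointing Copilot at it.
# Assumes Ollama's default port (11434) and the models named above.
import requests

BASE = "http://localhost:11434"

# 1. List the models the server actually has installed.
tags = requests.get(f"{BASE}/api/tags", timeout=10).json()
print("installed models:", [m["name"] for m in tags.get("models", [])])

# 2. One non-streaming chat round-trip with the chat model.
chat = requests.post(
    f"{BASE}/api/chat",
    json={
        "model": "llama3.2",
        "messages": [{"role": "user", "content": "Say hello in one word."}],
        "stream": False,
    },
    timeout=60,
).json()
print("chat reply:", chat["message"]["content"])

# 3. One embedding call with the embedding model.
emb = requests.post(
    f"{BASE}/api/embeddings",
    json={"model": "bge-m3", "prompt": "hello world"},
    timeout=60,
).json()
print("embedding length:", len(emb["embedding"]))
```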
This does not work for an MSTY server running on http://localhost:10000, with or without CORS, and with or without /v1/chat.
Both with and without CORS, I get the following error:

```
Error: BodyStreamBuffer was aborted
    at e.ping (plugin:copilot:565:32)
    at async q (plugin:copilot:805:69761)
```

With CORS turned off, I also get "404: Page Not Found".
MSTY's server is essentially an embedded Ollama server. You can download the Ollama server binary from Ollama's website and replace the MSTY server binary directly, and it all works. So MSTY's server is simply Ollama running on port 10000 instead of Ollama's default port 11434.
But it won't work with Copilot. It might work with genuine Ollama, I suppose, but that's not useful for me.
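For what it's worth, a small probe like the sketch below (assuming the MSTY server really is Ollama underneath) can show which routes actually respond on port 10000, which should narrow down whether the 404 comes from the URL path Copilot is requesting:

```python
# Probe which API routes the local server actually serves.
# Assumes the MSTY server on port 10000 behaves like Ollama; the paths
# below are Ollama's native route and its OpenAI-compatible routes.
import requests

BASE = "http://localhost:10000"

for path in ("/api/tags", "/v1/models", "/v1/chat/completions"):
    try:
        if path == "/v1/chat/completions":
            # The OpenAI-compatible chat route expects a POST.
            r = requests.post(
                f"{BASE}{path}",
                json={
                    "model": "llama3.2",  # hypothetical; use a model you have
                    "messages": [{"role": "user", "content": "ping"}],
                    "stream": False,
                },
                timeout=30,
            )
        else:
            r = requests.get(f"{BASE}{path}", timeout=10)
        print(f"{path}: HTTP {r.status_code}")
    except requests.ConnectionError as e:
        print(f"{path}: connection failed ({e})")
```

A route that answers 200 while another returns 404 would point to a mismatch between the endpoint Copilot constructs and what the server exposes.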
Also, I can't find any way to edit a model once it has been added, which is ridiculous...
It's working with Ollama. If you still have this issue, please reopen.