
Can't use it offline even with Ollama provider

Open Noriller opened this issue 8 months ago • 10 comments

It seems that even if I set everything up once, it won't let me use Copilot offline afterwards, even with a local Ollama instance running.

Whether offline or behind a proxy, when using a local provider like Ollama, I would expect the extension to work without an internet connection.

Noriller avatar Apr 14 '25 18:04 Noriller

Are you running Windows? Ollama works fine on Win11 running locally.

Models matter, streaming vs. non-streaming. Most models won't support edit or agent mode.

This should all work with the default installation (for Win11). You're using the Copilot service for custom models; I doubt that will ever work offline, because of too many pirates. (I'm building a web UI for Ollama, Gemini, and OpenAI.)

MyNuggets666 avatar Apr 14 '25 21:04 MyNuggets666

It runs fine when setting everything up. But the problem is that it seems to need a continuous internet connection afterward.

Considering I'm running Ollama locally, it's weird that it stops working without an internet connection.

From the extension logs, maybe it's some telemetry or similar call throwing an error and making it impossible to use.

Noriller avatar Apr 14 '25 23:04 Noriller

Thanks for opening this issue. The reason is that we do not use Ollama (or any other BYOK model) for all LLM calls right now. For example, we do intent detection to decide how best to handle a request in chat. Those calls still go through the Copilot Service.

We simply have not done this yet, and we might in the future based on user feedback. I'm leaving this issue open to gather more insights.
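The routing described above can be sketched as a toy model (hypothetical names, not the extension's actual code): even when the completion model is a local BYOK one, steps like intent detection still route to the hosted Copilot service, so a full chat turn fails offline.

```python
# Toy model of the routing described above -- hypothetical, not actual code.
# Each pipeline step is handled either by the local BYOK model or by the
# hosted Copilot service.
ROUTES = {
    "intent_detection": "copilot_service",  # still server-side, per the maintainers
    "chat_completion": "byok",              # the user's local Ollama model
}

def request_succeeds(steps, online):
    """A request works offline only if every step routes to the local model."""
    backends = {ROUTES[step] for step in steps}
    return online or backends == {"byok"}

# The raw completion alone would work offline...
print(request_succeeds(["chat_completion"], online=False))
# ...but a full chat turn includes intent detection, which needs the network.
print(request_succeeds(["intent_detection", "chat_completion"], online=False))
```

Full offline support would amount to moving every entry in that table to the local backend, which is what this issue is asking for.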

isidorn avatar Apr 17 '25 08:04 isidorn

According to the official docs, some API calls are still made to the GitHub Copilot API, for embeddings etc. Going fully local would mean supporting every inference engine's custom embedding endpoint, among other things.
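To illustrate why that is non-trivial, here is a hedged sketch (payload shapes only, no network calls; the helper name is made up) of how embedding requests differ between Ollama's native API and an OpenAI-compatible one:

```python
# Hypothetical helper showing why "fully local" means per-engine plumbing:
# each inference engine exposes a different embeddings endpoint and payload.
def embedding_request(engine, model, text):
    """Return (method, path, json_body) for the given engine's embeddings API."""
    if engine == "ollama":
        # Ollama's native embeddings endpoint
        return ("POST", "/api/embeddings", {"model": model, "prompt": text})
    if engine == "openai":
        # OpenAI-compatible endpoint (also served by many local engines)
        return ("POST", "/v1/embeddings", {"model": model, "input": text})
    raise ValueError(f"unsupported engine: {engine}")

# Same logical operation, two different wire formats:
print(embedding_request("ollama", "nomic-embed-text", "hello"))
print(embedding_request("openai", "text-embedding-3-small", "hello"))
```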

underlines avatar Apr 29 '25 16:04 underlines

+1 on offline support :)

cedricvidal avatar Jun 17 '25 17:06 cedricvidal

+1 for offline support

mepowerleo10 avatar Jun 21 '25 06:06 mepowerleo10

+1 to offline support

even bigger +1 to making the content filter calls to Azure optional!

lachiewalker avatar Jun 25 '25 02:06 lachiewalker

Another +1 for offline support. This would be really helpful in restricted environments.

infinitybiscuit avatar Jun 29 '25 21:06 infinitybiscuit

+1 to offline support

In the meantime, you can try this project, a proxy that simulates a connection, though I have no idea whether it still works: https://github.com/danielgross/localpilot

EnzoDeg40 avatar Jul 31 '25 09:07 EnzoDeg40

+1 for full offline support

mighele avatar Nov 05 '25 11:11 mighele

+1 for offline support

mrGrochowski avatar Nov 18 '25 13:11 mrGrochowski