Configure Ollama remotely
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Mac
- Continue:
- IDE: Visual Studio Code
Description
Ollama is running on a remote machine. I want to be able to point Continue at the remote Ollama server, but I couldn't find such an option.
To reproduce
No response
Log output
No response
@myrulezzz You can change the apiBase property on the object in the "models" array to wherever you serve Ollama.
Here is the full reference for config.json, where you can find such options: https://continue.dev/docs/reference/config. We'll do a better job of explaining this in our docs, though; I understand that this should definitely be on the Ollama reference page.
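For example, a minimal config.json sketch (the IP address is a placeholder for wherever your remote Ollama instance is listening):

```json
{
  "models": [
    {
      "title": "Ollama (Remote)",
      "provider": "ollama",
      "model": "AUTODETECT",
      "apiBase": "http://192.168.1.100:11434"
    }
  ]
}
```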
Ok, thanks. Also, the prompt setup is a bit confusing. How do I work out how to format the prompt in the Modelfile?
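For reference, Ollama models declare their prompt format with a TEMPLATE directive in the Modelfile, using Go template syntax. A minimal ChatML-style sketch, where the base model and the special tokens are illustrative and depend on the model you're building on:

```
FROM llama3
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
```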
Your config should look something like this:

```json
{
  "model": "AUTODETECT",
  "title": "Ollama (Remote)",
  "completionOptions": {},
  "apiBase": "http://192.168.1.100:11434",
  "provider": "ollama"
}
```
Just make sure that on the server where you're running Ollama you have OLLAMA_HOST set to enable network access to it; otherwise you can't reach it. https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server
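For example, a minimal sketch of what that looks like on the remote machine, per the FAQ above (0.0.0.0 binds Ollama to all network interfaces on its default port, so other hosts on your network can reach it):

```sh
# Make Ollama listen on all interfaces instead of only localhost.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```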