
Configure ollama remotely

myrulezzz opened this issue 1 year ago • 3 comments

Relevant environment info

- OS: Mac
- Continue:
- IDE: Visual Studio Code

Description

Ollama is running on a remote machine. I want to be able to point Continue at the remote Ollama server, but I couldn't find such an option.

To reproduce

No response

Log output

No response

myrulezzz avatar Apr 23 '24 21:04 myrulezzz

@myrulezzz You can change the apiBase property on the object in the "models" array to point to wherever you serve Ollama.

Here is the full reference for config.json, where you can find options like this: https://continue.dev/docs/reference/config. We'll do a better job of explaining this in our docs, though; I agree it should definitely be on the Ollama reference page.
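For illustration, a minimal sketch of how such an object sits inside the "models" array of config.json (the host address is a placeholder; 11434 is Ollama's default port):

    {
      "models": [
        {
          "title": "Ollama (Remote)",
          "provider": "ollama",
          "model": "AUTODETECT",
          "apiBase": "http://192.168.1.100:11434"
        }
      ]
    }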

sestinj avatar Apr 24 '24 07:04 sestinj

OK, thanks. Also, the prompt setup is a bit confusing. How do I work out how to format the prompt in the Modelfile?

myrulezzz avatar Apr 24 '24 08:04 myrulezzz

Your config should look something like this:

    {
      "model": "AUTODETECT",
      "title": "Ollama (Remote)",
      "completionOptions": {},
      "apiBase": "http://192.168.1.100:11434",
      "provider": "ollama"
    }

Just make sure that on the server where you are running Ollama you have OLLAMA_HOST set so that the server accepts network connections; otherwise you can't reach it. https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server
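As a rough sketch of what that looks like, following the examples in the Ollama FAQ linked above (0.0.0.0 makes the server listen on all interfaces; adjust to your own network and setup):

    # macOS: set the variable for launchd, then restart the Ollama app
    launchctl setenv OLLAMA_HOST "0.0.0.0"

    # Linux (systemd): run `systemctl edit ollama.service` and add under [Service]:
    #   Environment="OLLAMA_HOST=0.0.0.0"
    # then reload and restart the service:
    systemctl daemon-reload
    systemctl restart ollama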

BowTiedCrocodile avatar Apr 28 '24 20:04 BowTiedCrocodile