
VS Code Chat Ollama on local server ETIMEDOUT

Open · LeonDpl opened this issue 10 months ago · 2 comments

Before submitting your bug report

Relevant environment info

- Continue extension details:
  - Identifier: continue.continue
  - Version: 1.0.3
  - Last Updated: 2025-03-11, 17:43:47
- IDE: VS Code 1.98.0
- OS: Windows 11

Description

The Continue extension in VS Code, pointed at a local Ollama server reached over a VPN, fails with a timeout when using the chat, even though the same Ollama POST request succeeds from Postman.

To reproduce

Hi,

  • I am currently running a local Ollama server on a private network, reached through a VPN. I installed the VS Code Continue extension and want to use models from that server.

  • Trying to use the chat returns: request to http://[XXX.XX.X.XX]:11434/api/chat failed, reason: connect ETIMEDOUT [XXX.XX.X.XX]:11434

  • The error log is the following:

Error handling webview message: {
  "msg": {
    "messageId": "9c9bc2fc-fdb5-44fa-8d52-1e1d89adb569",
    "messageType": "llm/streamChat",
    "data": {
      "messages": [
        {
          "role": "system",
          "content": "..."
        },
        {
          "role": "user",
          "content": [
            {
              "type": "text",
              "text": "Say \"Hi!\""
            }
          ]
        },
        {
          "role": "assistant",
          "content": ""
        }
      ],
      "title": "deepseek-r1",
      "completionOptions": {}
    }
  }
}

Error: request to http://[XXX.XX.X.XX]:11434/api/chat failed, reason: connect ETIMEDOUT [XXX.XX.X.XX]:11434
  • Using Postman, the same POST request to http://[XXX.XX.X.XX]:11434/api/chat works with the following payload (returning response 200 and a valid answer):
{
  "model": "deepseek-r1:70b",
  "messages": [
    {
      "role": "system",
      "content": "..." 
    },
    {
      "role": "user", 
      "content": "Answer with one single word. The sky colour is"
    }
  ],
  "stream": false
}
  • My config.yaml is:
name: continue_config_v0
version: 0.0.1
schema: v1

models:
  - name: deepseek-r1
    model: deepseek-r1:70b
    provider: ollama
    apiBase: http://[XXX.XX.X.XX]:11434/
    roles:
      - chat
    defaultCompletionOptions:
      temperature: 0.5
      maxTokens: 2000

rules:
  - Give concise responses
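
For what it's worth, the request the extension makes differs from the Postman test in one respect: the chat panel streams (`llm/streamChat`), while the Postman payload used `"stream": false`. A small Python sketch can exercise both modes against the same `apiBase` from outside any client; the helper names are mine, and `XXX.XX.X.XX` stands in for the redacted address:

```python
import json
import urllib.request

def build_chat_payload(model: str, prompt: str, stream: bool) -> dict:
    # Mirrors the shape of the Postman payload from the report above.
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "..."},
            {"role": "user", "content": prompt},
        ],
        "stream": stream,
    }

def post_chat(api_base: str, payload: dict, timeout: float = 30.0) -> None:
    # POST to /api/chat; with stream=True Ollama replies line by line (JSON lines).
    req = urllib.request.Request(
        api_base.rstrip("/") + "/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        for line in resp:
            print(line.decode("utf-8").strip())

if __name__ == "__main__":
    # "XXX.XX.X.XX" is a placeholder for the redacted server address.
    payload = build_chat_payload("deepseek-r1:70b", 'Say "Hi!"', stream=True)
    post_chat("http://XXX.XX.X.XX:11434/", payload)
```

Since the error is connect ETIMEDOUT, it occurs before any payload is sent; so if this script succeeds from the same machine, the difference likely lies in how the extension's Node runtime opens the connection (e.g. proxy settings or name/address resolution), not in the payload.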
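
Since the config and payload look fine, one possible culprit (an assumption on my part, not confirmed in this report) is proxying: Node-based VS Code extensions typically route requests through any configured HTTP(S) proxy, and a proxy usually cannot reach a VPN-only address, which surfaces as `connect ETIMEDOUT`. Python's standard library implements the common `NO_PROXY` matching rules, so a quick sketch can check whether the Ollama host would bypass a configured proxy:

```python
import urllib.request

def host_bypasses_proxy(host: str) -> bool:
    # True if `host` matches the no_proxy/NO_PROXY rules in the environment.
    return bool(urllib.request.proxy_bypass_environment(host))

if __name__ == "__main__":
    print("Configured proxies:", urllib.request.getproxies())
    # "XXX.XX.X.XX" is a placeholder for the redacted Ollama server address.
    print("Bypasses proxy:", host_bypasses_proxy("XXX.XX.X.XX"))
```

If a proxy is configured and the host does not bypass it, adding the server's address to `NO_PROXY` (and to VS Code's proxy settings, if your version exposes them) may be worth trying.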

Log output

See the error log in the reproduction steps above (identical output).
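
Before comparing HTTP clients at all, it may help to confirm raw TCP reachability of the server from the machine VS Code runs on, since `connect ETIMEDOUT` happens before any HTTP is exchanged. A minimal sketch (the `can_connect` helper is mine; `XXX.XX.X.XX` is the redacted address):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    # True if a plain TCP connection to host:port succeeds within `timeout`.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # "XXX.XX.X.XX" is a placeholder for the redacted Ollama server address.
    print(can_connect("XXX.XX.X.XX", 11434))
```

If this returns True while the extension still times out, the network path is fine and the problem sits in how the extension's runtime makes the request.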

LeonDpl avatar Mar 14 '25 13:03 LeonDpl

Is this related? https://github.com/continuedev/continue/issues/4174

tomasz-stefaniak avatar Mar 18 '25 18:03 tomasz-stefaniak

FYI: I tried the Continue extension pre-release version today (1.1.14) and still hit the issue.

LeonDpl avatar Mar 24 '25 09:03 LeonDpl