
timeout request from devcontainer

bwdmr opened this issue 1 year ago · 3 comments


Relevant environment info

- OS: macos
- Continue: latest
- IDE: vscode
- Model: codestral
- config.json:
  
{
  "models": [
    {
      "title": "codestral",
      "model": "codestral",
      "contextLength": 32000,
      "apiBase": "http://host.lima.internal:11434",
      "provider": "ollama"
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "codestral",
    "model": "codestral",
    "apiBase": "http://host.lima.internal:11434",
    "provider": "ollama"
  },
  "contextProviders": [
    {
      "name": "code",
      "params": {}
    },
    {
      "name": "docs",
      "params": {}
    },
    {
      "name": "diff",
      "params": {}
    },
    {
      "name": "terminal",
      "params": {}
    },
    {
      "name": "problems",
      "params": {}
    },
    {
      "name": "folder",
      "params": {}
    },
    {
      "name": "codebase",
      "params": {}
    }
  ],
  "slashCommands": [
    {
      "name": "edit",
      "description": "Edit selected code"
    },
    {
      "name": "comment",
      "description": "Write comments for the selected code"
    },
    {
      "name": "share",
      "description": "Export the current chat session to markdown"
    },
    {
      "name": "cmd",
      "description": "Generate a shell command"
    },
    {
      "name": "commit",
      "description": "Generate a git commit message"
    }
  ],
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}

Description

Hi, I'm trying to use continue.dev in VS Code with Ollama running locally, but I fail to connect to the api/chat endpoint. I can curl it from the terminal inside the devcontainer:

curl http://host.docker.internal:11434/api/chat -d '{
  "model": "codestral",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
and adjusted the config with the apiBase directive as mentioned in the docs:

  "models": [
    {
      "title": "codestral",
      "model": "codestral",
      "contextLength": 32000,
      "apiBase": "http://host.docker.internal:11434",
      "provider": "ollama"
    }
  ],

yet I still fail to access the service from the continue.dev client, as I receive the following error message:

request to http://host.docker.internal:11434/api/chat failed, reason: getaddrinfo ENOTFOUND host.docker.internal

To reproduce

- can't run Ollama in Docker due to GPU acceleration, so it runs on the host
- can't access the host network from Rancher Desktop except via the default host.docker.internal
- the continue.dev client disregards the address and can't find the server
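One possible workaround (an assumption, not verified here with Rancher Desktop): define the hostname in the container yourself via runArgs in devcontainer.json, using Docker's host-gateway alias (Docker 20.10+):

```json
{
  "runArgs": ["--add-host=host.docker.internal:host-gateway"]
}
```

With that entry, host.docker.internal should resolve inside the devcontainer to the host, matching the apiBase in config.json.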

Log output

No response

bwdmr avatar Aug 04 '24 17:08 bwdmr