useLegacyCompletionsEndpoint option does not work for tabAutocompleteModel
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [x] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: macOS 14.2
- Continue: 0.8.42
- IDE: VS Code 1.91.0
- Model: deepseek
- config.json:
  
{
  "useLegacyCompletionsEndpoint": false,
  "allowAnonymousTelemetry": false,
  "models": [
    {
      "title": "llama3",
      "provider": "openai",
      "model": "llama3",
      "apiKey": "local",
      "apiBase": "http://localhost/v1"
    }
  ],
  "tabAutocompleteModel": {
    "title": "deepseek",
    "provider": "openai",
    "model": "deepseek",
    "apiKey": "local",
    "apiBase": "http://localhost/v1"
  },
  "embeddingsProvider": {
    "provider": "free-trial"
  },
  "reranker": {
    "name": "free-trial"
  }
}
Description
I am using a local OpenAI-compatible server that implements the chat completions API. I disabled useLegacyCompletionsEndpoint in config.json, but when I trigger tab autocompletion it still calls the legacy completions endpoint.
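To illustrate the mismatch, here is a rough sketch (not part of my actual setup; it assumes Node 18+ global fetch and the standard OpenAI request shapes, with the apiBase, API key, and model name taken from the config above) of probing both endpoints on the local server:

```typescript
// Sketch: probe the two OpenAI-style endpoints on the local server.
// The chat completions route is implemented; the legacy completions
// route is not, which matches the 404 in the log output below.
const apiBase = "http://localhost/v1";

async function probe(path: string, body: object): Promise<void> {
  const res = await fetch(`${apiBase}${path}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer local",
    },
    body: JSON.stringify(body),
  });
  console.log(path, res.status);
}

async function main(): Promise<void> {
  // Chat completions endpoint: served by the local server.
  await probe("/chat/completions", {
    model: "deepseek",
    messages: [{ role: "user", content: "hello" }],
  });

  // Legacy completions endpoint: this is what tab autocomplete still
  // requests, and it returns 404 on this server.
  await probe("/completions", { model: "deepseek", prompt: "hello" });
}

main().catch(console.error);
```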
To reproduce
No response
Log output
Logs from the developer tools console:
[Extension Host] HTTP 404 Not Found from http://127.0.0.1/v1/completions
{"detail":"Not Found"}
Code: undefined
Error number: undefined
Syscall: undefined
Type: undefined
Error: HTTP 404 Not Found from http://127.0.0.1/v1/completions
{"detail":"Not Found"}
    at customFetch (/Users/foo/.vscode/extensions/continue.continue-0.8.43-darwin-arm64/out/extension.js:102255:21)