Error calling Ollama /api/show endpoint: Error: Error: HTTP 404 Not Found from http://127.0.0.1:11434/api/show
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [ ] I'm not able to find an open issue that reports the same bug
- [ ] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Ubuntu 22.04
- Continue:
- IDE: VS Code
Description
When using an Ollama LLM, why does this extension call http://127.0.0.1:11434/api/show? Ollama does not have this endpoint.
Error message in the dev tools: Error calling Ollama /api/show endpoint: Error: Error: HTTP 404 Not Found from http://127.0.0.1:11434/api/show
To reproduce
No response
Log output
No response
@Wilson-fan8877 Ollama does have this endpoint, at least in their documentation: https://github.com/ollama/ollama/blob/main/docs/api.md#show-model-information
Could you share what version of Ollama you are using?
One other way to test this would be to try the following curl request:

```
curl http://localhost:11434/api/show -d '{
  "name": "llama3"
}'
```
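For comparison, this is roughly how an extension could make the same request from TypeScript, a minimal sketch using the global fetch available in Node 18+ (the function name and error format are just for illustration, not Continue's actual code):

```typescript
// Minimal sketch: POST the model name to Ollama's /api/show endpoint
// on a default local install and return the parsed JSON metadata.
async function showModel(name: string): Promise<unknown> {
  const res = await fetch("http://localhost:11434/api/show", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name }),
  });
  if (!res.ok) {
    // A 404 here produces exactly the kind of message reported above.
    throw new Error(`HTTP ${res.status} ${res.statusText} from ${res.url}`);
  }
  return res.json();
}

// Usage: logs the model's metadata (modelfile, parameters, template, ...).
showModel("llama3").then(console.log).catch(console.error);
```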
Has the problem been solved?
This is still a problem and thwarts onboarding for people using Ollama.
@trinque @homeant Are you both on the latest version of Ollama? Can you share more details about your setup?
It's pretty clear from their docs that this endpoint should exist, so I'm left wondering what difference in environment might be causing this. Any extra info will help me solve the problem more quickly.
I am facing the same issue in both VS Code and IntelliJ. It was working a month or so ago; I came back to code after a month and it's broken with the error message mentioned above. My tab autocomplete config:
"tabAutocompleteModel": {
"title": "codegemma:7b-code",
"provider": "ollama",
"model": "codegemma:7b-code"
},
"tabAutocompleteOptions": {
"useCopyBuffer": true,
"maxPromptTokens": 400,
"prefixPercentage": 0.5
}
Model config:
```
{
  "title": "codegemma:7b-code",
  "model": "codegemma:7b-code",
  "contextLength": 2048,
  "completionOptions": {},
  "apiBase": "http://localhost:11434",
  "provider": "ollama"
}
```
```
curl http://localhost:11434/api/show -d '{
  "name": "codegemma:7b-code"
}'
```

shows information about the model successfully.
Same for me.
I finally figured out that this is happening when requesting details about a model that is stored in an outdated format. So unless you manually go through and delete the older models, the response from Ollama will stay the same. For this reason we'll be ignoring this error in all future releases. The next release for JetBrains should be out early this week, and for VS Code it's already available in pre-release.
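For anyone curious what "ignoring it" can look like, here is a hedged sketch (the helper name tryShowModel is made up for illustration, not Continue's actual code): the /api/show call is treated as best-effort, so a 404 for a model in an outdated format just means proceeding without the metadata:

```typescript
// Sketch only: fetch model metadata from Ollama, but never let a failing
// /api/show (e.g. HTTP 404 for a model stored in an outdated format)
// break autocomplete or onboarding.
async function tryShowModel(
  apiBase: string,
  name: string,
): Promise<Record<string, unknown> | undefined> {
  try {
    const res = await fetch(`${apiBase}/api/show`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ name }),
    });
    if (!res.ok) {
      return undefined; // Ignore 404s and other HTTP errors; metadata is optional.
    }
    return (await res.json()) as Record<string, unknown>;
  } catch {
    return undefined; // Ignore network errors the same way.
  }
}
```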