Ollama with Codellama 7b not working
Before submitting your bug report
- [ ] I believe this is a bug. I'll try to join the Continue Discord for questions
- [ ] I'm not able to find an open issue that reports the same bug
- [ ] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS:
macOS
- Continue:
- IDE:
VS Code
Description
I tried to configure Ollama with Codellama 7b, but I keep getting a "http://localhost:11434/v1/api/chat url not found" error. I have confirmed my Ollama server is running.
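For reference, a minimal Continue `config.json` model entry for Ollama might look like the sketch below. The title and model tag are assumptions; adjust them to match what `ollama list` shows locally. Note that Continue targets Ollama's native API at the base URL, so the endpoint should not include a `/v1/api/chat` path:

```json
{
  "models": [
    {
      "title": "CodeLlama 7b",
      "provider": "ollama",
      "model": "codellama:7b",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```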
To reproduce
No response
Log output
No response
I was getting 404 errors using Ollama 0.1.10. After upgrading to Ollama 0.1.28, the error messages improved, giving a better 400 error response description. I suggest checking that your Ollama version is up to date.
@ajasingh MattyRad is correct that this will be resolved by upgrading your version of Ollama. I'm going to close the issue, but feel free to re-open or message me on Discord if you continue to see problems.