continue
bug
Before submitting your bug report
- [ ] I believe this is a bug. I'll try to join the Continue Discord for questions
- [ ] I'm not able to find an open issue that reports the same bug
- [ ] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS:
- Continue version:
- IDE version:
- Model:
- config:
OR link to assistant in Continue hub:
Description
Not enough context available to include the system message, last user message, and tools.
There must be at least 1000 tokens remaining for output.
Request had the following token counts:
- contextLength: 8192
- counting safety buffer: 163.84
- tools: ~806
- system message: ~285
- last user or tool + tool call message tokens: ~10678
- max output tokens: 4096
To reproduce
No response
Log output
Did you try increasing the values in your config.yaml under the model you're using?
Search for "defaultCompletionOptions" in the models reference for more info: https://docs.continue.dev/reference#models
My suggestion:

defaultCompletionOptions:
  contextLength: <value greater than 8192>
  maxTokens: <value greater than 4096>
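As a rough sketch of where those options sit in `config.yaml` (the model name matches the one later in this thread; the concrete values here are illustrative, not verified defaults):

```yaml
# Hypothetical config.yaml fragment: defaultCompletionOptions
# belongs under the specific model entry in the models list.
models:
  - name: devstral:24b
    provider: openai
    model: devstral:24b
    defaultCompletionOptions:
      contextLength: 16384 # illustrative; must exceed what the request needs
      maxTokens: 4096
```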
@miguelvaldez can you share your model config?
I am using openwebui as an openai endpoint:
My config is:
- name: devstral:24b
  provider: openai
  model: devstral:24b
  env:
    useLegacyCompletionsEndpoint: false
  apiBase: https://localai.XXXX/api
  apiKey: 00OO0xxxx
  roles:
    - chat
    - edit
    - apply
  capabilities:
    - tool_use
Not enough context available to include the system message, last user message, and tools.
There must be at least 1000 tokens remaining for output.
Request had the following token counts:
- contextLength: 8192
- counting safety buffer: 163.84
- tools: ~6098
- system message: ~345
- last user or tool + tool call message tokens: ~1695
- max output tokens: 4096
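A quick check of the numbers reported in the error above shows why an 8192-token context fails here (this is just arithmetic on the logged counts, not Continue's internal accounting):

```python
# Token counts copied from the error message above
context_length = 8192
safety_buffer = 163.84
tools = 6098
system_message = 345
last_message = 1695
max_output = 4096

# Total context the request would need to fit everything
required = safety_buffer + tools + system_message + last_message + max_output

# Tokens left for output after the prompt-side pieces are subtracted;
# Continue requires at least 1000 remaining for output
remaining_for_output = context_length - (safety_buffer + tools + system_message + last_message)

print(round(required, 2))              # ~12397.84 tokens needed in total
print(round(remaining_for_output, 2))  # negative: the prompt alone overflows 8192
```

The prompt-side pieces already exceed 8192 before any output tokens are counted, so the "at least 1000 tokens remaining" check can never pass at this contextLength.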
If I set the completion options to this it works:
defaultCompletionOptions:
  maxTokens: 100000
  contextLength: 100000
What is strange is that Ollama on the backend is using the correct context size of 40960 that is set in OpenWebUI. No matter what I set these numbers to, the backend is unchanged.
The changes you make here only affect your extension; they are used to estimate how much context Continue can provide to the model. They would not affect your backend.
@tomasz-stefaniak num_ctx is passed to Ollama