
bug

Open miguelvaldez opened this issue 5 months ago • 3 comments

Before submitting your bug report

Relevant environment info

- OS:
- Continue version:
- IDE version:
- Model:
- config:
  OR link to assistant in Continue hub:

Description

Not enough context available to include the system message, last user message, and tools.
There must be at least 1000 tokens remaining for output.
Request had the following token counts:
- contextLength: 8192
- counting safety buffer: 163.84
- tools: ~806
- system message: ~285
- last user or tool + tool call message tokens: ~10678
- max output tokens: 4096
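Reading the numbers in the error: the safety buffer, tools, system message, and last user/tool message alone (163.84 + 806 + 285 + 10678 ≈ 11,933 tokens) already exceed the 8192-token contextLength, before the 4096 max output tokens are even reserved, so the request cannot fit.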

To reproduce

No response

Log output


miguelvaldez · Jun 12 '25 19:06

Did you try increasing the values in your config.yaml under the model you're using?

Search for the term "defaultCompletionOptions" in the models reference for more info: https://docs.continue.dev/reference#models

My suggestion:

    defaultCompletionOptions:
      contextLength: <value greater than 8192>
      maxTokens: <value greater than 4096>
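For example, in config.yaml it sits under the model entry you are using. A minimal sketch; the model name, endpoint, and values below are placeholders, not taken from your setup:

    models:
      - name: devstral:24b
        provider: openai
        model: devstral:24b
        apiBase: https://your-endpoint/api   # placeholder endpoint
        defaultCompletionOptions:
          contextLength: 32768   # must exceed the 8192 reported in the error
          maxTokens: 4096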

fyun89 · Jun 13 '25 01:06

@miguelvaldez can you share your model config?

RomneyDa · Jun 13 '25 01:06

I am using Open WebUI as an OpenAI-compatible endpoint.

My config is:

  - name: devstral:24b
    provider: openai
    model: devstral:24b
    env:
      useLegacyCompletionsEndpoint: false
    apiBase: https://localai.XXXX/api
    apiKey: 00OO0xxxx
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use

The error I get is:

    Not enough context available to include the system message, last user message, and tools.
    There must be at least 1000 tokens remaining for output.
    Request had the following token counts:
    - contextLength: 8192
    - counting safety buffer: 163.84
    - tools: ~6098
    - system message: ~345
    - last user or tool + tool call message tokens: ~1695
    - max output tokens: 4096

If I set the completion options to this, it works:

    defaultCompletionOptions:
      maxTokens: 100000
      contextLength: 100000
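
Merged into the model entry above, the working config looks roughly like this (same placeholder apiBase and apiKey as before; the values are the ones from this comment):

      - name: devstral:24b
        provider: openai
        model: devstral:24b
        env:
          useLegacyCompletionsEndpoint: false
        apiBase: https://localai.XXXX/api   # placeholder endpoint
        apiKey: 00OO0xxxx                   # placeholder key
        roles:
          - chat
          - edit
          - apply
        capabilities:
          - tool_use
        defaultCompletionOptions:
          maxTokens: 100000
          contextLength: 100000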

What is strange is that Ollama on the backend is using the correct context size of 40960 that is set in Open WebUI. No matter what I set these numbers to, the backend is unchanged.

geiseri · Jun 14 '25 04:06

> What is strange is that Ollama on the backend is using the correct context size of 40960 that is set in Open WebUI. No matter what I set these numbers to, the backend is unchanged.

The changes you make here only affect your extension; they are used to estimate how much context Continue can provide to the model. They would not affect your backend.

tomasz-stefaniak · Jul 11 '25 21:07

@tomasz-stefaniak num_ctx is passed to Ollama

RomneyDa · Aug 12 '25 02:08
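
For anyone hitting this with an Ollama backend: if Continue talks to Ollama directly instead of going through an OpenAI-compatible proxy like Open WebUI, the contextLength is forwarded as num_ctx per the comment above. A minimal sketch, assuming a direct Ollama setup; the model name and values here are illustrative, not from this thread:

    models:
      - name: devstral:24b
        provider: ollama
        model: devstral:24b
        defaultCompletionOptions:
          contextLength: 40960   # forwarded to Ollama as num_ctx
          maxTokens: 4096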