
Maximum Context Length Mismatch with OpenAI Compatible Models

Open • yushaw opened this issue 1 month ago • 6 comments

What happened?

When using an OpenAI-compatible model (DeepSeek-V3, exposed as "deepseek-chat"), API calls fail with context length errors.

The error suggests Cline is assuming an incorrect context window for this model.

I would like to know how to set the correct context length limit.

Steps to reproduce

  1. Set the base URL and API key for DeepSeek's OpenAI-compatible endpoint and select the model "deepseek-chat".
  2. Run a few Cline tasks so the conversation history grows.
  3. The request is rejected once the accumulated context exceeds the model's window (see the sketch after these steps).
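
For reference, a minimal reproduction sketch of the kind of request the provider receives. This is not Cline's internal code; the base URL and key are placeholders, and the DeepSeek endpoint shown is an assumption to be checked against DeepSeek's docs.

```python
# Minimal sketch, assuming DeepSeek's OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepseek.com",  # assumed OpenAI-compatible base URL
    api_key="YOUR_DEEPSEEK_API_KEY",      # placeholder
)

# As the conversation history accumulates across Cline turns, the message
# payload eventually exceeds the 65,536-token window and the server
# rejects the request with a 400 error like the one below.
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "..."}],  # grows with each turn
)
print(response.choices[0].message.content)
```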

Relevant API REQUEST output

400 This model's maximum context length is 65536 tokens. However, you requested 67918 tokens (67918 in the messages, 0 in the completion). Please reduce the length of the messages or completion.
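
The arithmetic in the error is straightforward: 67,918 tokens were sent against a 65,536-token window, an overflow of 2,382 tokens. A rough client-side workaround sketch is below; the 4-characters-per-token estimate and the reserved completion headroom are assumptions for illustration, not DeepSeek's actual tokenizer or Cline's logic.

```python
# Hypothetical workaround sketch: drop the oldest non-system messages until
# an approximate token count fits under the model's 65,536-token window.

MAX_CONTEXT_TOKENS = 65536
RESERVED_FOR_COMPLETION = 8192  # assumed headroom for the model's reply

def approx_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token (heuristic only)."""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict]) -> list[dict]:
    """Discard the oldest non-system turns until the estimate fits the budget."""
    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_COMPLETION
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(approx_tokens(m["content"]) for m in system + rest) > budget:
        rest.pop(0)  # drop the oldest turn first
    return system + rest
```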

Operating System

macOS 15.2

Cline Version

v3.0.5

Additional context

No response

yushaw • Dec 27 '24 04:12