Maximum Context Length Mismatch with OpenAI Compatible Models
What happened?
When using an OpenAI-compatible model (deepseek-v3), API calls fail with context length errors.
The error suggests Cline is assuming an incorrect context window for the model.
I want to know how to set the correct context length limit.
Steps to reproduce
- Set the base URL and API key for DeepSeek, using the model "deepseek-chat"
- Run Cline for a few turns
- Encounter the error below
Relevant API REQUEST output
400 This model's maximum context length is 65536 tokens. However, you requested 67918 tokens (67918 in the messages, 0 in the completion). Please reduce the length of the messages or completion.
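The error means the assembled request (67918 tokens) exceeds deepseek-chat's 65536-token context window. Until the client exposes a configurable limit, one workaround is to trim the oldest conversation turns before sending. The sketch below is a hypothetical illustration, not Cline's actual logic: the 4-characters-per-token estimate and the reserved completion budget are assumptions, and a real client should use the model's tokenizer.

```python
# Hypothetical sketch: drop the oldest non-system messages until the
# estimated request size fits the model's context window.
MAX_CONTEXT_TOKENS = 65536       # deepseek-chat's limit, from the error above
RESERVED_FOR_COMPLETION = 4096   # room left for the model's reply (assumption)

def estimate_tokens(text: str) -> int:
    """Rough heuristic: ~4 characters per token; a real client
    should count with the model's tokenizer instead."""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict]) -> list[dict]:
    """Keep system messages, discard the oldest other turns until
    the total estimate fits within the context budget."""
    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_COMPLETION
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and sum(estimate_tokens(m["content"])
                       for m in system + rest) > budget:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```

This is lossy (old turns are forgotten); summarizing truncated history before dropping it would preserve more context.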
Operating System
macOS 15.2
Cline Version
v3.0.5
Additional context
No response