continue
Maxtokens Err / Prompt cannot be empty
Before submitting your bug report
- [x] I believe this is a bug. I'll try to join the Continue Discord for questions
- [x] I'm not able to find an open issue that reports the same bug
- [x] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS:
- Continue version:
- IDE version:
- Model:
- config:
```yaml
name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: qwen2p5-72b-instruct
    provider: openai
    model: gpt-4-qwen2p5-72b-instruct
    apiBase: http://0.0.0.0:1234/QWEN2-5-Agent/v1
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use
    defaultCompletionOptions:
      temperature: 0.6
      contextLength: 96000
      maxTokens: 8000
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
    params:
      nRetrieve: 25
      nFinal: 10
      useReranking: true
```
Description
The model I deployed supports a maximum of 128k tokens, but in the config.yaml file I configured maxTokens as 8000. When I tried to use Edit, I got the error "Prompt cannot be empty". The error log is:

Malformed JSON sent from server: {"error": {"object": "error", "message": "Prompt cannot be empty", "type": "BadRequestError", "param": null, "code": 400}}
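For context, the 400 comes from the OpenAI-compatible server rejecting a request whose prompt/messages content is empty. A minimal sketch of a client-side guard that reproduces the same check before the request is sent (the helper name is hypothetical, not part of Continue's code; the payload shape assumes the standard Chat Completions format):

```python
import json


def build_chat_payload(model: str, messages: list, max_tokens: int) -> str:
    """Build a Chat Completions request body; reject empty prompts
    locally, mirroring the server's "Prompt cannot be empty" check."""
    if not messages or all(not m.get("content", "").strip() for m in messages):
        raise ValueError("Prompt cannot be empty")
    return json.dumps(
        {"model": model, "messages": messages, "max_tokens": max_tokens}
    )


# A non-empty prompt serializes fine; an empty one raises before hitting
# the server, which is the behavior the error log shows server-side.
payload = build_chat_payload(
    "gpt-4-qwen2p5-72b-instruct",
    [{"role": "user", "content": "Refactor this function."}],
    8000,
)
```

This suggests the Edit request Continue sent reached the server with no prompt content, rather than maxTokens itself being the problem.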
To reproduce
No response
Log output