Custom commands need to be written twice.
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 10
- Continue: 0.9.126
- IDE: VS Code 1.86.2
- model: DeepSeek 33B
Description
I've written a few slash commands, like logs and ut, but none of them work; the model just describes the code instead. Within the same chat, the first time I use a command I get a code explanation, and when I send it again it works. The same prompt and model work fine in JetBrains.
To reproduce
Here is the config:
{
  "models": [
    {
      "title": "Deepseek",
      "provider": "openai",
      "model": "deepseek-33b-instruct:,
      "apiBase": "http://my-api-base/v1",
      "systemMessage": "you are an AI programming code-completion assistent, utilizing the deepseek coder model. you answer programming related questions.",
      "apikey": "my-apikey",
      "contextLength": 16000
    }
  ],
  "completeionOptions": {
    "temperature": 0.3
  },
  "customCommands": [
    {
      "name": "ut",
      "prompt": "{{{ input }}}\n\nAdd logs for thr selected code. Give the new code just as chat output, don't edit any file.",
      "description": "Add logs to highlighted code"
    }
  ]
}
Log output
No response
@eladamittai I notice that there is a colon rather than a quote at the end of your model name ("model": "deepseek-33b-instruct:,). Was this just a typo when copying over to GitHub Issues?
Assuming that's not preventing the config from loading, the first thing I would check is the prompt logs: https://docs.continue.dev/troubleshooting#llm-prompt-logs. This will tell us whether the instructions are being left out entirely, or whether the problem is something else.
I would also recommend removing your system prompt. This is a detail particular to DeepSeek models: they automatically get a system prompt very similar to yours, and using a different one might be causing the model to act strangely.
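For reference, and assuming the colon really was just a copy-paste typo, the model entry with those two changes would look something like this (keeping your placeholder apiBase and apikey values as-is):

{
  "title": "Deepseek",
  "provider": "openai",
  "model": "deepseek-33b-instruct",
  "apiBase": "http://my-api-base/v1",
  "apikey": "my-apikey",
  "contextLength": 16000
}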
@sestinj hey, thanks for the quick reply! 🙏 I removed the system prompt and checked the logs, and the entire command is being left out of the context. Instead of the command, it just puts /log before the context. I also tried writing the prompt directly in the chat instead of using the slash command, and it worked fine. Same with the /edit command. So the problem is only with the custom commands.
Got it! I think this is related to a fix I made just yesterday. I believe it should be available in the latest pre-release, 0.9.130.
@sestinj great! I'll check it out. Thanks!
It works!