Custom LLM Provider works only on first message and hangs on subsequent messages
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: Windows 11
- Continue: v0.039, v0.041
- IDE: PyCharm Community Edition 2023.3.3, IntelliJ IDEA Community (ideaIC-2023.3.4.win)
- Model: Custom model
Description
Main Issue
When defining a custom LLM Provider according to the docs (https://continue.dev/docs/model-setup/configuration#defining-a-custom-llm-provider) and saving the config.ts file in PyCharm or IntelliJ, the Continue extension only responds to the first message. An SQLITE_CONSTRAINT: NOT NULL error message appears after the first message, as seen:
After which, subsequent messages just hang:
I tried both v0.039 and v0.041 of the Continue extension and faced the same issue. This issue is not present in VS Code, however, and the same config.ts file works fine there.
Side issue:
I believe the documentation here (https://continue.dev/docs/model-setup/configuration#defining-a-custom-llm-provider) is outdated as well. This code throws an error:
```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "My Custom LLM",
      model: "mistral-7b",
    },
    streamComplete: async function* (prompt, options) {
      // Make the API call here
      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 10; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
    },
  });
}
```
This is the working one that I used to replicate the above errors in the IDE:
```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "MyModel",
      model: "customModel",
    },
    streamCompletion: async function* (prompt, options) {
      // Make the API call here
      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 5; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
    },
  });
  return config;
}
```
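For reference, the async-generator contract used above can be exercised on its own. The consumer loop below is a hypothetical stand-in (Continue's core drives the generator internally); it only illustrates how the yielded chunks stream to a caller:

```typescript
// Standalone sketch of the streaming contract from the config above.
// The consumer here is hypothetical, not Continue's actual code.
async function* streamCompletion(prompt: string): AsyncGenerator<string> {
  // Toy completion: yield five chunks with a short delay, as in the config.
  for (let i = 0; i < 5; i++) {
    yield `- ${i}\n`;
    await new Promise((resolve) => setTimeout(resolve, 10));
  }
}

async function main() {
  let output = "";
  // for await ... of consumes each yielded chunk as it arrives.
  for await (const chunk of streamCompletion("count for me")) {
    output += chunk;
  }
  console.log(output.trimEnd());
}

main();
```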
To reproduce
- Change contents of config.ts file to:
```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "MyModel",
      model: "customModel",
    },
    streamCompletion: async function* (prompt, options) {
      // Make the API call here
      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 5; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
    },
  });
  return config;
}
```
- Launch IDE (either PyCharm or IntelliJ)
- Select MyModel at the bottom of Continue Extension
- Send a message to MyModel. It will count from 0 to 4. SQL error should appear:
- Send a second message to MyModel. It will hang
Log output
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
+1
@foongzy @timdahlmanns thanks for sharing all of the details, that made the fix quite straightforward! It looks like we just weren't setting providerName in the CustomLLM class: https://github.com/continuedev/continue/commit/c3d3980e97af39ef75a8112963a18644ec807a69
I'm finishing up JetBrains 0.0.46 but the store may not accept it until Monday
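A plausible reading of the fix, as a hypothetical sketch (the field and table names here are assumptions, not Continue's actual schema): when a session is persisted, the provider name is written to a NOT NULL column, so a CustomLLM that never sets providerName binds NULL and trips the constraint on the first write.

```typescript
// Hypothetical sketch of how an unset providerName could surface as
// SQLITE_CONSTRAINT: NOT NULL. Column and table names are assumptions.
interface SessionRow {
  title: string;
  providerName: string; // NOT NULL column in the sessions table
}

function insertSession(row: { title: string; providerName?: string }): SessionRow {
  // Mimic SQLite's NOT NULL check: an undefined value binds as NULL and is rejected.
  if (row.providerName === undefined) {
    throw new Error(
      "SQLITE_CONSTRAINT: NOT NULL constraint failed: sessions.providerName",
    );
  }
  return { title: row.title, providerName: row.providerName };
}

// Before the fix: CustomLLM left providerName unset, so this insert fails.
// After commit c3d3980, providerName is set and the insert succeeds.
```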
On the mention of the docs fix: are you seeing red underlines in config.ts, or is it also entirely failing to load before you remove those type annotations? It looks more like this may be a mistake in how we're shipping the types file than in the documentation
@sestinj thanks for looking into it. Regarding the docs fix, I think it's just a typo between streamComplete and streamCompletion. The file fails to load when using streamComplete, and the error hints to use streamCompletion instead
Thanks for the quick fix, but I still get the SQLITE_CONSTRAINT error in the 0.0.46 version.
@sestinj the SQLITE_CONSTRAINT error still exists for me too
Hi @sestinj, any updates regarding this bug?
The error is fixed for me in the latest version.
ahhh yes, thanks @timdah!