
Custom LLM Provider works only on first message and hangs on subsequent messages

Open foongzy opened this issue 10 months ago • 6 comments


Relevant environment info

- OS: Windows 11
- Continue: v0.0.39, v0.0.41
- IDE: PyCharm Community Edition 2023.3.3, IntelliJ IDEA Community 2023.3.4 (ideaIC-2023.3.4.win)
- Model: Custom model

Description

Main Issue

When I define a custom LLM provider according to the docs (https://continue.dev/docs/model-setup/configuration#defining-a-custom-llm-provider) and save the config.ts file in PyCharm or IntelliJ, the Continue extension only works for the first message. An `SQLITE_CONSTRAINT: NOT NULL` error appears after the first message (see the attached error screenshot), and subsequent messages just hang (see the attached hang screenshot). I tried both v0.0.39 and v0.0.41 of the Continue extension and hit the same issue. The issue is not present in VS Code: the same config.ts file works fine there.

Side issue:

I believe the documentation here (https://continue.dev/docs/model-setup/configuration#defining-a-custom-llm-provider) is outdated as well. This code throws an error:

```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "My Custom LLM",
      model: "mistral-7b",
    },
    streamComplete: async function* (prompt, options) {
      // Make the API call here

      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 10; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
    },
  });
}
```

This is the working version I used to reproduce the errors above in the IDE:

```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "MyModel",
      model: "customModel",
    },
    streamCompletion: async function* (prompt, options) {
      // Make the API call here

      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 5; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
    },
  });
  return config;
}
```
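As a side note for anyone adapting this, the async-generator contract can be exercised outside the IDE. Below is a minimal, self-contained sketch: the `Config` and model interfaces are simplified stand-ins defined for illustration, not Continue's actual types, and the counting loop mirrors the toy example above.

```typescript
// Simplified stand-ins for Continue's types (illustrative assumptions,
// not the real API surface).
interface ModelOptions {
  title: string;
  model: string;
}
interface CustomModel {
  options: ModelOptions;
  streamCompletion: (prompt: string, options?: object) => AsyncGenerator<string>;
}
interface Config {
  models: CustomModel[];
}

export function modifyConfig(config: Config): Config {
  config.models.push({
    options: { title: "MyModel", model: "customModel" },
    streamCompletion: async function* (prompt, options) {
      // A real provider would call its HTTP API here and yield each
      // streamed chunk; this toy version just counts to 5.
      for (let i = 0; i < 5; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 10));
      }
    },
  });
  // Forgetting this return is one of the two bugs in the docs example.
  return config;
}

// Collect everything the generator yields into one string.
export async function collect(gen: AsyncGenerator<string>): Promise<string> {
  let out = "";
  for await (const chunk of gen) out += chunk;
  return out;
}
```

Running `collect` on the pushed model's `streamCompletion` makes it easy to confirm the generator streams all five chunks before debugging anything IDE-specific.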

To reproduce

  1. Change the contents of the config.ts file to:

```typescript
export function modifyConfig(config: Config): Config {
  config.models.push({
    options: {
      title: "MyModel",
      model: "customModel",
    },
    streamCompletion: async function* (prompt, options) {
      // Make the API call here

      // Then yield each part of the completion as it is streamed
      // This is a toy example that will count to 10
      for (let i = 0; i < 5; i++) {
        yield `- ${i}\n`;
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
    },
  });
  return config;
}
```

  2. Launch the IDE (either PyCharm or IntelliJ)
  3. Select MyModel at the bottom of the Continue extension
  4. Send a message to MyModel. It will count from 0 to 4, then the SQLITE_CONSTRAINT error appears (see the error screenshot)
  5. Send a second message to MyModel. It will hang

Log output

```
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
[info] Exiting Continue core...
[info] Starting Continue core...
[info] Starting Continue core...
```

foongzy avatar Apr 05 '24 04:04 foongzy

+1

ghost avatar Apr 24 '24 14:04 ghost

@foongzy @timdahlmanns thanks for sharing all of the details, that made the fix quite straightforward! It looks like we just weren't setting providerName in the CustomLLM class: https://github.com/continuedev/continue/commit/c3d3980e97af39ef75a8112963a18644ec807a69

I'm finishing up JetBrains 0.0.46 but the store may not accept it until Monday

sestinj avatar Apr 26 '24 20:04 sestinj
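To make the root cause concrete, here is a hedged sketch of the kind of fix the commit describes. The class and field names follow the comment above, but the body and the `"custom"` default are illustrative assumptions, not Continue's real implementation: chat history is apparently written to a local SQLite table whose provider column is NOT NULL, so a provider object that never sets `providerName` ends up inserting NULL and trips SQLITE_CONSTRAINT.

```typescript
// Illustrative sketch only, not Continue's actual source code.
class CustomLLM {
  title: string;
  providerName: string;

  constructor(options: { title: string; providerName?: string }) {
    this.title = options.title;
    // Before the fix, providerName could be left undefined and reach the
    // database as NULL; defaulting it (hypothetical default value) keeps
    // the NOT NULL constraint satisfied.
    this.providerName = options.providerName ?? "custom";
  }
}
```

The point is simply that every code path constructing a custom provider must yield a non-null provider name before any history row is inserted.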

On the mention of the docs fix: are you seeing red underlines in config.ts, or is it also entirely failing to load before you remove those type annotations? It looks more like a mistake in how we're shipping the types file than in the documentation

sestinj avatar Apr 26 '24 20:04 sestinj

@sestinj thanks for looking into it. Regarding the docs fix: I think it's just a typo between streamComplete and streamCompletion. The file fails to load when using streamComplete, and the error hints to use streamCompletion instead

foongzy avatar Apr 26 '24 23:04 foongzy

> @foongzy @timdahlmanns thanks for sharing all of the details, that made the fix quite straightforward! It looks like we just weren't setting providerName in the CustomLLM class: c3d3980
>
> I'm finishing up JetBrains 0.0.46 but the store may not accept it until Monday

Thanks for the quick fix, but I still get the SQLITE_CONSTRAINT error in the 0.0.46 version.

timdah avatar Apr 29 '24 13:04 timdah

> @foongzy @timdahlmanns thanks for sharing all of the details, that made the fix quite straightforward! It looks like we just weren't setting providerName in the CustomLLM class: c3d3980 I'm finishing up JetBrains 0.0.46 but the store may not accept it until Monday
>
> Thanks for the quick fix, but I still get the SQLITE_CONSTRAINT error in the 0.0.46 version.

@sestinj the SQLITE_CONSTRAINT error still exists for me too

foongzy avatar Apr 30 '24 01:04 foongzy

Hi @sestinj, any updates regarding this bug?

foongzy avatar Jul 08 '24 02:07 foongzy

The error is fixed for me in the latest version.

timdah avatar Jul 08 '24 05:07 timdah

> The error is fixed for me in the latest version.

ahhh yes, thanks @timdah!

foongzy avatar Jul 08 '24 07:07 foongzy