
Can't make it work with LMStudio

tthierryEra opened this issue 1 year ago · 1 comment


Relevant environment info

- OS: Windows
- Continue: v0.8.25
- IDE: VsCode

I'm using LM Studio with Mistral or Phi or Llama 3, whatever model.

Description

(screenshots attached)

Hello,

When using LM Studio the result is always the same: in the chat and in files, I always get the system prompt echoed back, and nothing I try with the prompt formatting fixes it.

Really need help here. It works with Ollama and external APIs.

Thanks

To reproduce

No response

Log output

No response

tthierryEra avatar May 02 '24 20:05 tthierryEra

@tthierryEra can you share your config.json so I can better help debug?

sestinj avatar May 02 '24 21:05 sestinj

Hello Thank you

I didn't change anything other than adding these models:

```json
{
  "title": "LM Studio - Llama 8G",
  "provider": "lmstudio",
  "model": "lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF"
},
{
  "title": "Crusoeai Llama 8B - 262k",
  "provider": "lmstudio",
  "model": "crusoeai/Llama-3-8B-Instruct-262k-GGUF"
},
```

tthierryEra avatar May 03 '24 11:05 tthierryEra

@tthierryEra thanks. I think the best place to check would be the "Output" tab next to the VS Code terminal: in the dropdown on the right, select "Continue - LLM Prompts/Completions". This shows the exact prompt sent to the LLM. Together with the logs you see on the side of LM Studio, this will likely show us what we need to see (it's almost certainly just a prompt formatting mistake).

The first possible solution I can think of is to double-check your prompt formatting settings on the LM Studio side. I believe you can edit these in the right-side panel.

sestinj avatar May 03 '24 17:05 sestinj

So from my testing, whenever I add a system prompt, whether directly in LM Studio or in Continue's config (e.g. config.ts), that system prompt gets printed back into the chat.

If I leave everything empty it "works", but that's really not useful, as the LLM doesn't know how to react or interact.
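For reference, this is roughly how I set the system prompt on the Continue side. This is only a sketch: I believe the per-model field in config.json is called `systemMessage`, but that name and the exact placement are my assumptions, so please verify against the Continue docs:

```json
{
  "models": [
    {
      "title": "LM Studio - WizardLM-2 7B",
      "provider": "lmstudio",
      "model": "MaziyarPanahi/WizardLM-2-7B-GGUF",
      "systemMessage": "You are a useful coding assistant."
    }
  ]
}
```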

Example:

Output from Continue:

```
Settings:
contextLength: 4096
model: MaziyarPanahi/WizardLM-2-7B-GGUF
maxTokens: 1024

<user>
hello

Completion:

You are a useful coding assistant.hello! How can I assist you today?

As a coding assistant, I'm here to help with a wide range of programming-related questions and tasks. Whether you need assistance writing or understanding code, debugging issues, learning about best practices, or exploring new technologies, feel free to ask. What do you need help with today?
```

I also tried leaving everything empty, or adding it as the prefix for the user message. Not working either :-(

```
Settings:
contextLength: 4096
model: MaziyarPanahi/WizardLM-2-7B-GGUF
maxTokens: 1024

############################################

<user>
hello. How's it going?

==========================================================================
==========================================================================
Completion:

You are a useful coding assistant.<user>hello. How's it going?</user>Hello there! I'm doing well, thank you. It's always a pleasure to assist with any questions or tasks you might have. How can I help you today?
```

Maybe I'm doing something wrong?

I tried this (screenshot), or leaving everything empty (screenshot), but it's still not working and I get the same result (screenshot).

tthierryEra avatar May 03 '24 18:05 tthierryEra

I just ran into this problem and fixed it on my end. The broken behavior happens when running LM Studio 0.2.22; reverting to version 0.2.21 fixed the prompt formatting.

xinnod avatar May 04 '24 02:05 xinnod

You closed the issue because the solution is to downgrade LM Studio? Or is there a fix in Continue?

tthierryEra avatar May 06 '24 16:05 tthierryEra

> I just ran into this problem and fixed it on my end. The broken behavior happens when running LM Studio 0.2.22. Reverting back to the 0.2.21 version has fixed the prompting formatting.

@xinnod Where did you find the old version? Thanks!

tthierryEra avatar May 08 '24 14:05 tthierryEra