[CON-234] Autocomplete only generates '}' in unsaved files
Before submitting your bug report
- [X] I believe this is a bug. I'll try to join the Continue Discord for questions
- [X] I'm not able to find an open issue that reports the same bug
- [X] I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS:
- Continue: 0.9.124
- IDE: vscode
Description
Completions only show '}', regardless of what is in the file
This is a local workspace. There is no output in "Continue - LLM Prompts/Completions" when the file is unsaved. If I save the file, I see prompts and autocompletes.
Probably an issue reading unsaved files (in ideUtils.ts readFile)
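If that hypothesis is right, the fix would be to prefer the editor's in-memory buffer over the file on disk whenever the document is dirty (an unsaved file has no on-disk contents at all). A minimal sketch of that fallback logic, with hypothetical interface and function names that are not Continue's actual `ideUtils.ts` API:

```typescript
// Hypothetical sketch: prefer the in-memory buffer for unsaved (dirty)
// documents, and fall back to reading from disk otherwise.
interface OpenDocument {
  uri: string;
  isDirty: boolean;
  getText(): string;
}

function readFileContents(
  uri: string,
  openDocuments: OpenDocument[],
  readFromDisk: (uri: string) => string,
): string {
  // For an unsaved file, the open editor buffer is the only source of truth.
  const open = openDocuments.find((d) => d.uri === uri);
  if (open && open.isDirty) {
    return open.getText();
  }
  return readFromDisk(uri);
}
```

In VS Code terms, the open-buffer lookup would go through `vscode.workspace.textDocuments` and `TextDocument.isDirty`; if `readFile` goes straight to disk instead, an untitled buffer reads as empty and the model only ever sees a blank prompt.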
To reproduce
This error is reproducible when I create a new file (language set to Python via the status bar) that is not yet saved.
- Make new file
- Type some stuff
- The first completion might be fine, but subsequent ones are only '}'
Log output
No response
I am facing the same issue.
I cannot confirm this issue (it is an old issue from May 3). I tried the steps mentioned to reproduce:
- fresh unsaved file (turned off autosave etc)
- started typing in that file
- autocomplete worked as expected; I just pressed `tab` a few times to generate the following code
VSCode "About":
Version: 1.92.1
Commit: eaa41d57266683296de7d118f574d0c2652e1fc4
Date: 2024-08-07T20:16:39.455Z
Electron: 30.1.2
ElectronBuildId: 9870757
Chromium: 124.0.6367.243
Node.js: 20.14.0
V8: 12.4.254.20-electron.0
OS: Darwin arm64 23.5.0
Pre-Release Continue (0.9.195) with Codestral via Mistral API
It still happens. The only difference I see between us is OS and the Ollama local instance instead of API.
VS Code "About"
Version: 1.92.1 (user setup)
Commit: eaa41d57266683296de7d118f574d0c2652e1fc4
Date: 2024-08-07T20:16:39.455Z
Electron: 30.1.2
ElectronBuildId: 9870757
Chromium: 124.0.6367.243
Node.js: 20.14.0
V8: 12.4.254.20-electron.0
OS: Windows_NT x64 10.0.26120
Continue - v0.9.192 (pre-release) with Phi3-mini via Ollama
> It still happens. The only difference I see between us is OS and the Ollama local instance instead of API. [...] Continue - v0.9.192 (pre-release) with Phi3-mini via Ollama
phi-3 is great, but it may not work for autocomplete. See here for recommended (local) models -> https://docs.continue.dev/setup/select-model#autocomplete
[Update: I just checked https://huggingface.co/microsoft/Phi-3-mini-4k-instruct and that model does not seem to have been trained on the necessary Fill-in-the-Middle tokens, so it most likely won't work for anything but (chat) completion]
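For context on why FIM training matters here: autocomplete prompts wrap the text before and after the cursor in special sentinel tokens so the model completes the gap between them. The exact token strings differ per model; the ones below are the StarCoder-style tokens, used purely as an illustration:

```typescript
// Illustration of a Fill-in-the-Middle prompt. The sentinel tokens below
// are the StarCoder-style ones; other FIM-capable models (e.g. Codestral)
// use their own token strings.
function buildFimPrompt(prefix: string, suffix: string): string {
  return `<fim_prefix>${prefix}<fim_suffix>${suffix}<fim_middle>`;
}
```

A model that was never trained on these sentinels treats them as ordinary text, which is exactly the situation where a chat-tuned model tends to emit junk like a lone `}` instead of a real completion.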
As an alternative you can try the still completely free Codestral API, see here -> https://docs.continue.dev/walkthroughs/set-up-codestral
If the problem still persists then, I am willing to concede that this is a Continue problem 😃
Otherwise: Thank you for bringing this still open issue up again, but in general, please refrain from issue necromancy and putting a one sentence comment under them in the future.
Also, @sestinj please close this issue once it's confirmed that the original problem is no longer an issue (that was with VSCode directly IMHO and seems to have been fixed in the meantime, as my short test above shows).
- I agree. The intention is to use an internal model through the plugin. Since I encountered this issue, I tested other open-source models to make sure the issue wasn't with the model's response.
- The above reason is also why I preferred not to use the Codestral API.
- Apologies for not providing more information initially; I wasn't sure how active the thread was.
- The problem persists with the recommended model as well.
https://github.com/user-attachments/assets/7852bbbc-e489-470b-b638-95a1dff550aa
Thank you for your reply and I apologize if I sounded rude in any way, that is not my intention.
But I still think this is a case of a dumb model. phi-3 is not designed for autocomplete, and deepseek-1.3B is the tiniest of the models. I think I have shown that this does not happen with Codestral via API. Is there any way you can test a bigger model (6B or even the 16B one)? Or an API, just for testing?
So I have looked a bit into Continue's source code and I cannot see where this } might come from; I am fairly sure it is sent by the model. There is some post-processing happening here -> https://github.com/continuedev/continue/blob/main/core/autocomplete/postprocessing.ts but nowhere is a } inserted.
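One quick way to confirm the } really comes from the model would be a guard that flags bracket-only completions before they reach the editor. This is a hypothetical helper, not anything in Continue's actual postprocessing.ts:

```typescript
// Hypothetical check: true when a completion is non-empty but consists
// only of closing brackets and whitespace - the pattern a model that
// doesn't understand FIM prompts tends to produce.
function isBracketOnlyCompletion(completion: string): boolean {
  return /^[\s}\])]*$/.test(completion) && completion.trim().length > 0;
}
```

Logging the raw model output whenever this returns true would make it easy to tell whether the } originates in the model response or somewhere in the extension.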
Also I noticed you are running 0.9.192, which is a bit outdated. Can you please update to either 0.8.45 (stable) or 0.9.195 (Pre-Release), so we have a common baseline?
No worries. I tried out all your suggestions.
- [x] Switched to 0.8.45 (stable version)
- [x] Switched to 0.9.195 (Pre-Release)
- [x] Tested Codestral API
- [x] Tested a bigger model - Starcoder2 7B
Additionally,
- [x] Reopened the IDE after each change to make sure there were no residual settings, and verified the configurations
Here are my observations:
- The closing braces issue doesn't happen in either version when I use the Codestral API:
  - Works without any issues in 0.8.45 (stable version)
  - Hallucinates the `import java.util.Scanner` line in 0.9.195 (Pre-Release)
- The closing braces issue does happen with the bigger model via Ollama in both versions.
- Providing the same input via the ollama CLI gives me proper responses.