
CodeQwen returns extra white space for code completion

Open · ycclnn opened this issue on Apr 24 '24 · 4 comments

[screenshot of a completion showing the extra leading white space]

Other models like DeepSeek work without a problem.

ycclnn · Apr 24 '24 07:04

Thanks for reporting the issue - I've observed it as well and am looking into debugging it. It seems that, for CodeQwen, the tokenizer treats _ as a space regardless of context.
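For reference, here's a minimal sketch of how one could inspect the tokenizer's round-tripping behavior. It assumes the Hugging Face transformers package; the model id "Qwen/CodeQwen1.5-7B" is an assumption and may need adjusting:

```python
# Minimal sketch to check how CodeQwen's tokenizer round-trips "_" vs. spaces.
# Assumes the Hugging Face `transformers` package; the model id
# "Qwen/CodeQwen1.5-7B" is an assumption, adjust as needed.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("Qwen/CodeQwen1.5-7B")

for text in ["foo_bar", "foo bar", "    indented"]:
    ids = tok(text, add_special_tokens=False)["input_ids"]
    # Decode each token id individually so any "_"/space confusion is visible.
    pieces = [tok.decode([i]) for i in ids]
    print(repr(text), "->", pieces)
```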

wsxiaoys · Apr 24 '24 07:04

Can confirm this is present in upstream llama.cpp as well.

Cross-posted at: https://github.com/ggerganov/llama.cpp/issues/7050

wsxiaoys · May 02 '24 20:05

Yeah, the extra white space shows up when using vLLM as well, so it's a model issue rather than a serving-framework issue. The only workaround I can think of is to shift the cursor left a few characters and then check the completion result against the overlapping substring, as sketched below.
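A rough sketch of that overlap check (a hypothetical helper, not Tabby's actual post-processing) could look like this:

```python
# Rough sketch of the workaround described above: request the completion from
# a cursor shifted left by a few characters, then drop the part of the
# completion that overlaps the text already present before the real cursor.
# Hypothetical helper, not Tabby's actual post-processing.
def strip_overlap(prefix_tail: str, completion: str) -> str:
    """Remove the longest leading chunk of `completion` that matches the end
    of the text before the cursor (`prefix_tail`)."""
    for n in range(min(len(prefix_tail), len(completion)), 0, -1):
        if prefix_tail.endswith(completion[:n]):
            return completion[n:]
    return completion

# Example: the cursor was shifted left past "foo.", so the model regenerates
# it; only the genuinely new text is kept.
print(strip_overlap("foo.", "foo. bar()"))  # -> " bar()"
```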

ycclnn · May 06 '24 00:05

Observed that CodeQwen does not always return the extra white space, and sometimes the leading white space is meaningful, so simply trimming it may not be a workable approach.

ycclnn · May 06 '24 00:05

We've confirmed this is actually an issue with Qwen's vocab - based on my communication with the Qwen team, it should be fixed in the next CodeQwen release.

wsxiaoys · Jun 11 '24 12:06