
Ollama code completions have a problem

Open fashen97 opened this issue 10 months ago • 2 comments

What happened?

When I choose the deepseek-coder-v2:latest model, code completion works well, but when I choose the deepseek-r1:7b model, it only completes ``` instead of useful code.

Relevant log output or stack trace


Steps to reproduce

No response

CodeGPT version

2.16.3-241.1

Operating System

macOS

fashen97 avatar Feb 20 '25 03:02 fashen97

I am having this problem with codellama:7b and codellama:13b-code (as well as other models) using code completion with Ollama and CodeGPT (ProxyAI 2.16.4-241.1) and Ollama server version 0.5.7 (local network, port forwarded locally). This is a screenshot of the error showing up in WebStorm 2024.3.4. I'm on macOS 15.3.1.

[Screenshot: error in WebStorm]

{"error":"registry.ollama.ai/library/codellama:13b does not support insert"}

Stacktrace

java.lang.RuntimeException
	at ee.carlrobert.llm.completion.CompletionEventSourceListener.onFailure(CompletionEventSourceListener.java:118)
	at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:52)
	at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1583)

I should mention that the chat function is working fine with the same models.
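The split behavior is consistent with chat and completion hitting different Ollama endpoints: chat goes through `/api/chat`, while inline completion presumably calls `/api/generate` with a `suffix` field, which Ollama rejects for models whose template has no fill-in-the-middle (insert) support — hence the "does not support insert" error. A minimal sketch of the two request payloads (the endpoint names and fields are Ollama's documented API; that CodeGPT sends exactly these payloads is an assumption):

```python
import json

OLLAMA_URL = "http://localhost:11434"  # Ollama's default port

def chat_payload(model: str, message: str) -> dict:
    # POST {OLLAMA_URL}/api/chat — works for any model with a chat template,
    # which is why chat succeeds with these same models.
    return {
        "model": model,
        "messages": [{"role": "user", "content": message}],
        "stream": False,
    }

def completion_payload(model: str, prefix: str, suffix: str) -> dict:
    # POST {OLLAMA_URL}/api/generate — the "suffix" field requests
    # fill-in-the-middle (insert); models without FIM tokens in their
    # template respond with {"error": "... does not support insert"}.
    return {
        "model": model,
        "prompt": prefix,
        "suffix": suffix,
        "stream": False,
    }

fim = completion_payload("codellama:13b-code", "def add(a, b):\n    ", "\n")
print(json.dumps(fim, indent=2))
```

Posting the second payload to a model that lacks FIM support should reproduce the error above, while the first payload succeeds, matching the observed chat-vs-completion split.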

Could be related to issue #799

cavebatsofware avatar Feb 28 '25 01:02 cavebatsofware

Update on this

It appears that if I change the plugin settings and check the box "Use built-in FIM template", it works as expected.

This works for me:

[Screenshot: settings that work]

This does not work for me right now:

[Screenshot: settings that do not work]

cavebatsofware avatar Feb 28 '25 01:02 cavebatsofware
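That fix makes sense if the built-in FIM template rewrites the prefix/suffix into the model's own fill-in-the-middle tokens in the prompt itself, so the unsupported `suffix` API field is never needed. For CodeLlama the published infill format is `<PRE> {prefix} <SUF>{suffix} <MID>`; a hedged sketch of what such a template would produce (the token strings are CodeLlama's documented FIM markers, not verified against CodeGPT's actual template):

```python
def codellama_fim_prompt(prefix: str, suffix: str) -> str:
    # CodeLlama infill format: the model generates the "middle" text
    # after the <MID> token, so a plain /api/generate call with this
    # prompt (and no "suffix" field) yields an inline completion.
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

prompt = codellama_fim_prompt("def add(a, b):\n    return ", "\n")
print(prompt)
```

Other model families use different markers (e.g. DeepSeek-Coder and StarCoder each have their own FIM tokens), which would explain why a single custom template works for some models and not others.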