Ollama code completion has a problem
What happened?
When the deepseek-coder-v2:latest model is selected, code completion works well, but with the deepseek-r1:7b model the completion only returns ```, not useful code.
Relevant log output or stack trace
Steps to reproduce
No response
CodeGPT version
2.16.3-241.1
Operating System
macOS
I am having this problem with codellama:7b and codellama:13b-code (as well as other models) using code completion with Ollama and CodeGPT (ProxyAI 2.16.4-241.1) against an Ollama server (on my local network, port forwarded locally) running version 0.5.7. This is a screenshot of the error showing up in WebStorm 2024.3.4. I'm on macOS 15.3.1.
{"error":"registry.ollama.ai/library/codellama:13b does not support insert"}
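For context, Ollama returns this "does not support insert" error from its `/api/generate` endpoint when a request includes a `suffix` field (used for fill-in-the-middle completion) but the selected model's template has no FIM support. A minimal sketch of such a request body, assuming a local Ollama server on the default port (the model name is just an example):

```python
import json

# Build a fill-in-the-middle request for Ollama's /api/generate endpoint.
# Models whose templates lack FIM support reject the "suffix" field with
# an error like: {"error":"... does not support insert"}.
payload = {
    "model": "codellama:13b",           # example model name
    "prompt": "def add(a, b):\n    ",   # text before the cursor
    "suffix": "\n    return result",    # text after the cursor
    "stream": False,
}
body = json.dumps(payload)

# To actually send it (requires a running Ollama server):
#   curl http://localhost:11434/api/generate -d @- <<< "$body"
```

If the same request succeeds without the `suffix` field, that points at missing FIM support in the model template rather than a plugin bug.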
Stacktrace
java.lang.RuntimeException
at ee.carlrobert.llm.completion.CompletionEventSourceListener.onFailure(CompletionEventSourceListener.java:118)
at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:52)
at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)
I should mention that the chat function is working fine with the same models.
Could be related to issue #799
Update on this
It appears that if I change the plugin settings and check the "Use built-in FIM template" box, it works as expected.
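For anyone hitting the same error: checking that box presumably makes the plugin wrap the prompt in a fill-in-the-middle template client-side, instead of relying on the server-side `suffix` handling that triggers the "does not support insert" error. As a rough illustration (the exact template the plugin uses may differ), CodeLlama's documented infill format looks like this:

```python
def build_codellama_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap prefix/suffix in CodeLlama's fill-in-the-middle tokens.

    The model then generates the code that belongs between <SUF>...<MID>
    and the end of the prompt, i.e. the text between prefix and suffix.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# Example: ask the model to fill in the body of a function.
prompt = build_codellama_fim_prompt("def add(a, b):\n    ", "\n")
```

Since this prompt is ordinary text, it works even when the model's server-side template has no insert support, which would explain why the checkbox fixes the error.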