Srikanth Srungarapu
Can you please provide more context on this? I looked through the [API documentation](https://github.com/ollama/ollama/blob/main/docs/api.md) provided by Ollama, and there is no such parameter related to user identification.
Hi @doganaktarr, sorry for the delay in responding. The error message says "codellama" was not found. Can you please confirm that it is available on your Ollama instance?
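One way to confirm this is to list the models your Ollama instance has pulled. A minimal sketch using Ollama's documented `GET /api/tags` endpoint is below; the default host `http://localhost:11434` and the `check_model` helper name are assumptions for illustration:

```python
import json
import urllib.request

def model_available(tags_response: dict, name: str) -> bool:
    """True if `name` matches any installed model, ignoring the tag suffix
    (so "codellama" matches "codellama:latest" or "codellama:7b")."""
    return any(
        m.get("name", "").split(":")[0] == name
        for m in tags_response.get("models", [])
    )

def check_model(name: str, host: str = "http://localhost:11434") -> bool:
    """Query GET /api/tags on the Ollama instance and look for `name`."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return model_available(json.load(resp), name)
```

From a shell, `ollama list` shows the same information, and `ollama pull codellama` fetches the model if it is missing.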
From the attached image, it looks like the request was sent to the Ollama instance but never received a response. When this happens, I usually restart my Ollama instance...
One way to check whether Ollama itself is the problem is to copy the prompt from the VS Code logs and try it directly on the Ollama CLI.
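The same replay can also be done against Ollama's documented `POST /api/generate` endpoint, which takes the failing step out of the editor entirely. A minimal sketch, assuming the default host `http://localhost:11434` (the `replay_prompt` helper name is made up for illustration):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> bytes:
    """JSON body for POST /api/generate; stream=False returns one JSON reply
    instead of a stream of chunks, which is easier to eyeball when debugging."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def replay_prompt(model: str, prompt: str, host: str = "http://localhost:11434") -> str:
    """Send the prompt copied from the VS Code logs and return the completion."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

If this hangs or errors the same way, the problem is on the Ollama side rather than in the extension. From a shell, `ollama run codellama "<prompt>"` is the equivalent check.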
Apologies for the delayed response. I tested it with large files on my end too. The inline suggestions are not displayed when the inference times are longer (~5 seconds or...
Quite possible. The key point, per my understanding, is that for the files you want to run Privy on, the inference time from your Ollama host shouldn't take more than a...
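To see whether your host is within that budget, you can time the request yourself. A minimal sketch; the ~5-second threshold comes from the discussion above, and the helper names (`timed`, `too_slow`) are hypothetical:

```python
import time

INLINE_BUDGET_S = 5.0  # illustrative budget for inline suggestions

def timed(fn, *args, **kwargs):
    """Run `fn` and return (result, elapsed seconds)."""
    start = time.monotonic()
    result = fn(*args, **kwargs)
    return result, time.monotonic() - start

def too_slow(elapsed_s: float, budget_s: float = INLINE_BUDGET_S) -> bool:
    """True when the measured inference time exceeds the suggestion budget."""
    return elapsed_s > budget_s
```

Wrapping the replayed prompt in `timed(...)` and checking `too_slow(elapsed)` tells you whether the inference time, not the extension, is why suggestions never appear.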