Debounce, LlamaCpp support, expose prompt as setup option, fix passing parameters to model (ollama)
Hi, this PR fixes https://github.com/tzachar/cmp-ai/issues/8.
It also gives you the ability to customize the prompt, and adds a debounce delay:
`debounce_delay` - the request is sent x ms after the last key press. It also adds support for Llama Server.
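The debounce idea described above can be sketched as follows. This is not the plugin's actual Lua code, just a minimal, language-agnostic illustration of the pattern: each keystroke cancels the pending request and re-arms a timer, so only the last call in a rapid burst fires. The names (`debounce`, `request_completion`) are made up for the example.

```python
import threading
import time

def debounce(delay_ms):
    """Return a decorator that delays calls until `delay_ms` of inactivity.

    Each new call cancels the pending one, so only the last call in a
    rapid burst actually runs -- the same idea as a `debounce_delay`
    option for a completion source.
    """
    def decorator(fn):
        timer = None
        lock = threading.Lock()

        def wrapper(*args, **kwargs):
            nonlocal timer
            with lock:
                if timer is not None:
                    timer.cancel()  # drop the previously scheduled request
                timer = threading.Timer(delay_ms / 1000.0, fn, args, kwargs)
                timer.start()
        return wrapper
    return decorator

calls = []

@debounce(50)
def request_completion(text):
    calls.append(text)

# Simulate three fast keystrokes: only the last one should trigger a request.
for typed in ["f", "fo", "foo"]:
    request_completion(typed)
time.sleep(0.2)
print(calls)  # ['foo']
```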
+1 for fixing the ollama params; without this, you can't point it at a remote host.
@JoseConseco Any updates?
I will have to google how to split this into multiple PRs, with one file per PR.
The point here is that I do not believe debounce should be implemented inside this plugin, as cmp already implements debounce. You have yet to convince me it is needed here. If you do manage to, we can merge this as is after addressing the other issues.
@tzachar let me know if all is ok now.
see pending issues in the review.
> see pending issues in the review.

Ah, ok, I thought it was up to you to accept the changes and mark them as resolved. Will do it right now.
+1 for a fast merge. Maybe the debounce stuff should be done as a PR to cmp, then?
Waiting for the last issue to be resolved.