
Debounce, LlamaCpp support, expose prompt as setup option, fix passing parameters to model (ollama)

[Open] JoseConseco opened this issue 2 years ago · 9 comments

Hi, this PR fixes https://github.com/tzachar/cmp-ai/issues/8

It also gives you the ability to customize the prompt, and adds a debounce delay:

`debounce_delay`: the request will be sent x ms after the last key press. The PR also adds support for the llama.cpp server.
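Based on the PR description, the new options would presumably be wired into the setup call roughly like this (a sketch only: `debounce_delay` and `prompt` follow the names in the PR description, while the provider, model, and the `prompt` callback's signature are illustrative assumptions):

```lua
local cmp_ai = require('cmp_ai.config')

cmp_ai:setup({
  -- Wait 500 ms after the last key press before sending a request (PR option).
  debounce_delay = 500,
  -- Customize the prompt sent to the model (PR option); the callback's
  -- arguments here are an assumption, not the confirmed signature.
  prompt = function(lines_before, lines_after)
    return lines_before
  end,
  provider = 'Ollama',
  provider_options = {
    model = 'codellama:7b-code',
  },
})
```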

JoseConseco avatar Dec 19 '23 15:12 JoseConseco

+1 for fixing the ollama params; without this, you can't point it at a remote host.
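The remote-host complaint is about provider parameters not being forwarded to the Ollama request. Once they pass through, a remote instance could presumably be configured along these lines (option names such as `base_url` are illustrative, not a confirmed API):

```lua
require('cmp_ai.config'):setup({
  provider = 'Ollama',
  provider_options = {
    -- Point at a remote Ollama instance instead of localhost
    -- (endpoint path and option name are assumptions).
    base_url = 'http://192.168.1.50:11434/api/generate',
    model = 'codellama:7b-code',
    -- Model parameters that the fix should forward to the request.
    options = { temperature = 0.2 },
  },
})
```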

ALameLlama avatar Dec 25 '23 05:12 ALameLlama

@JoseConseco Any updates?

tzachar avatar Jan 10 '24 09:01 tzachar

I will have to google how to split this into multiple PRs, with one file per PR.

JoseConseco avatar Jan 10 '24 13:01 JoseConseco

> I will have to google how to split this into multiple PRs, with one file per PR.

The point here is that I do not believe debounce should be implemented inside this plugin, as cmp already implements debounce. You have yet to convince me it is needed here. If you do manage to, we can merge this as is after addressing the other issues.
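For context, nvim-cmp ships its own debouncing, configurable through the `performance` table of its setup; a minimal sketch (the values are illustrative):

```lua
require('cmp').setup({
  performance = {
    debounce = 150, -- ms to wait after typing stops before querying sources
    throttle = 30,  -- ms between successive source refreshes
  },
})
```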

tzachar avatar Jan 11 '24 07:01 tzachar

@tzachar let me know if all is ok now.

JoseConseco avatar Jan 12 '24 15:01 JoseConseco

> @tzachar let me know if all is ok now.

see pending issues in the review.

tzachar avatar Feb 08 '24 08:02 tzachar

> > @tzachar let me know if all is ok now.
>
> see pending issues in the review.

Ah, ok. I thought it was up to you to accept the changes and mark them as resolved. I'll do that right now.

JoseConseco avatar Feb 09 '24 16:02 JoseConseco

+1 for fast merge. Maybe the debounce stuff should be done as a PR to cmp then?

mdietrich16 avatar Mar 12 '24 21:03 mdietrich16

> +1 for fast merge. Maybe the debounce stuff should be done as a PR to cmp then?

Waiting for the last issue to be resolved.

tzachar avatar Mar 13 '24 09:03 tzachar