Sang
I believe that supporting configurable summary models is important, as enforcing a specific model will only lead to more issues (for instance, using `gpt-4o (Azure)` now still sends summary requests...
I compiled and tested a version locally; it works great. But I would like to raise a few minor requests: 1. Can it automatically focus on the first search box? (Perhaps...
Oh, and the query parameter doesn't seem to be working.  
> prefix = "", > middle = "", > suffix = "", not work for me (deepseek-coder-v2:16b): ``` require('llm').setup({ backend = "ollama", -- backend ID, "huggingface" | "ollama" | "openai"...
> I think you can try a base model (such as `deepseek-coder-v2:16b-lite-base-q4_0`) instead of an instruct model

I eventually found that when using ollama, some changes need...
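For illustration, the kind of setup being discussed might look like the sketch below: an `llm.nvim` config pointing the ollama backend at a base (non-instruct) model with fill-in-the-middle enabled. The URL and the FIM token strings are assumptions for illustration only; the actual sentinel tokens must be taken from the model card for the model you run, and this is not a confirmed working config:

```lua
-- Hypothetical llm.nvim config for FIM completion against an
-- ollama-served base model (sketch, not a verified setup).
require('llm').setup({
  backend = "ollama",            -- backend ID, "huggingface" | "ollama" | "openai"
  model = "deepseek-coder-v2:16b-lite-base-q4_0",
  url = "http://localhost:11434",-- assumed default ollama endpoint
  fim = {
    enabled = true,
    prefix = "<PRE>",            -- placeholder: use your model's FIM prefix token
    middle = "<MID>",            -- placeholder: use your model's FIM middle token
    suffix = "<SUF>",            -- placeholder: use your model's FIM suffix token
  },
})
```

Base models are usually preferred here because instruct-tuned variants tend to reply conversationally instead of emitting a raw completion between the FIM markers.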