25 comments by Sang

I believe that supporting configurable summary models is important, as enforcing a specific model will only lead to more issues (for instance, using `gpt-4o (Azure)` now still sends summary requests...

I compiled and tested a version locally, and it works great. But I would like to raise a few minor requests: 1. Can it automatically focus on the first search box? (Perhaps...

Oh, and the query parameter doesn't seem to be working. ![image](https://github.com/glanceapp/glance/assets/49085035/e3e060b1-f388-4339-b030-f777220dd4e0) ![image](https://github.com/glanceapp/glance/assets/49085035/4b41b5a2-4178-4ecb-88f8-6113b3e7d75f)

> prefix = "", > middle = "", > suffix = "", not work for me (deepseek-coder-v2:16b): ``` require('llm').setup({ backend = "ollama", -- backend ID, "huggingface" | "ollama" | "openai"...

> I think you can try a base model (such as `deepseek-coder-v2:16b-lite-base-q4_0`) instead of an instruct model

I eventually found that when using Ollama, some changes need...
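For context, a working Ollama setup along these lines might look like the sketch below. This is a hypothetical configuration, not the author's actual one: the `url`, the exact `fim` token strings (taken from the DeepSeek-Coder fill-in-the-middle format), and their mapping onto llm.nvim's `prefix`/`suffix`/`middle` fields are all assumptions that should be verified against your llm.nvim version and the model card:

```lua
-- Hypothetical sketch: llm.nvim with the Ollama backend and a base
-- (non-instruct) DeepSeek-Coder model. Token strings and URL below are
-- assumptions; verify against the deepseek-coder-v2 model card.
require('llm').setup({
  backend = "ollama",
  model = "deepseek-coder-v2:16b-lite-base-q4_0",
  url = "http://localhost:11434", -- default Ollama address (assumption)
  fim = {
    enabled = true,
    prefix = "<｜fim▁begin｜>", -- opens the code before the cursor
    suffix = "<｜fim▁hole｜>",  -- marks the gap to fill
    middle = "<｜fim▁end｜>",   -- closes the prompt before generation
  },
})
```

The key point the comment hints at is that with an empty `prefix`/`middle`/`suffix`, a base model receives no FIM markers at all and cannot tell where the completion hole is, so non-empty, model-specific tokens are required.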