Checking for fish running inside vterm also works:

```fish
if [ "$INSIDE_EMACS" = vterm ]
    function vterm_prompt_end --on-variable PWD
        vterm_printf '51;A'(whoami)'@'(hostname)':'(pwd)
    end
end
```
For anyone interested in this topic, I have implemented the desired functionality in a Python script called [devcontainer-cli-port-forwarder](https://github.com/nohzafk/devcontainer-cli-port-forwarder). The solution is based on @chrmarti's suggestion to use `socat` for port...
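For a rough idea of the approach (a minimal sketch, not the script's actual code; the function name, flags, and port are illustrative), the relay can be built from two `socat` processes bridged over `devcontainer exec`:

```python
# Minimal sketch of the socat relay idea, assuming the devcontainer CLI
# and socat are installed on the host and socat is also available inside
# the container. Names and arguments here are hypothetical.
import subprocess

def forward_port(workspace: str, port: int) -> subprocess.Popen:
    """Relay localhost:<port> on the host to the same port in the container."""
    listener = f"TCP-LISTEN:{port},reuseaddr,fork"
    # socat's EXEC address runs a command and relays its stdio; here that
    # command is a second socat running inside the container via
    # `devcontainer exec`.
    bridge = (
        f"EXEC:devcontainer exec --workspace-folder {workspace} "
        f"socat STDIO TCP:localhost:{port}"
    )
    return subprocess.Popen(["socat", listener, bridge])

if __name__ == "__main__":
    proc = forward_port(".", 8000)  # forward host port 8000 into the container
    try:
        proc.wait()
    except KeyboardInterrupt:
        proc.terminate()
```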
I can confirm that removing the gutter settings solves the margin issue in `src/main.ts`:

```diff
-  foldGutter: {
-    minFoldSize: 1,
-  },
-  foldOptions: {
-    widget: " ...",
...
```
I tried; I can open the org file in both cases, but the formatting is not rendered correctly.
I used LM Studio to start a local server successfully; you can try http://localhost:11434 as the API URL in the extension options.
It has something to do with the prompt format and OpenAI API compatibility, which LM Studio can handle. I don't have experience with ollama, so you might need to figure...
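As a quick way to verify that a local server speaks the OpenAI-style API (a minimal sketch, assuming the server exposes the common `/v1/chat/completions` route; the model name is a placeholder):

```python
# Minimal sanity check for an OpenAI-compatible local endpoint.
# The URL and model name are assumptions; adjust for your setup.
import json
import urllib.request

payload = {
    "model": "local-model",  # placeholder; many local servers ignore this
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
print(body["choices"][0]["message"]["content"])
```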
I definitely think supporting local LLMs is the ideal choice. The task is well-suited to a small local LLM. Any contributions are welcome! 👍
I believe that [Candle](https://github.com/huggingface/candle) is an excellent choice, and I recommend considering supporting it. Candle primarily focuses on serverless inference and provides the ability to run models within browsers using...
I hold the same opinion; I tried the Microsoft Phi-2 model (very small, around 2.7 GB), but unfortunately it did not perform well on this classification task.