llm.nvim
LLM powered development for Neovim
Hi there, thanks for the cool plugin. I noticed an issue caused by how you merge the default config with the user-provided config in [config.lua](https://github.com/huggingface/llm.nvim/blob/51b76dac9c33c0122adfe28daf52ceaa31c4aa02/lua/llm/config.lua#L72). I usually use ollama...
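A minimal sketch of the kind of merge behavior at stake, assuming a Neovim Lua setup (the table keys below are illustrative, not the plugin's actual defaults). A shallow merge replaces whole nested tables, silently dropping default keys the user did not restate; `vim.tbl_deep_extend("force", ...)` merges them recursively instead:

```lua
-- Hypothetical defaults and user config for illustration only.
local defaults = {
  lsp = { bin_path = nil, host = nil, port = nil, version = "0.5.2" },
  backend = "huggingface",
}

local user = {
  backend = "ollama",
  lsp = { bin_path = "/usr/local/bin/llm-ls" },
}

-- A deep merge keeps defaults.lsp.version while taking the user's
-- bin_path; a shallow merge would lose every default key under `lsp`.
local merged = vim.tbl_deep_extend("force", defaults, user)
```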
I built llm-ls locally and configured it as follows:

```lua
lsp = {
  bin_path = "/home/myuser/soft/GPT/llm-ls-0.5.2/target/release/llm-ls",
  host = nil,
  port = nil,
  version = "0.5.2",
},
```

However, when opening...
I have managed to get llm autocompletion to work using cmp-ai, but then I lose basic LSP support (IntelliSense).
When the configured LLM server is unreachable, the UI is blocked by error messages, making typing extremely slow. This can of course be avoided by disabling generation, but maybe...
In order to set environment variables only for the LSP server, this sets cmd_env from the config if available. Reference: https://neovim.io/doc/user/lsp.html#vim.lsp.ClientConfig --- Example usage: see the example lazy.nvim plugin configuration below. with...
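As a hedged sketch of what such a configuration could look like (the `lsp.cmd_env` key is the change being proposed here, not an existing llm.nvim option, and the environment variable name is only an example):

```lua
-- Hypothetical lazy.nvim spec; `lsp.cmd_env` is the proposed config key.
{
  "huggingface/llm.nvim",
  opts = {
    lsp = {
      bin_path = vim.fn.expand("$HOME/.local/bin/llm-ls"),
      -- Environment variables visible only to the llm-ls process,
      -- passed through to :h vim.lsp.ClientConfig `cmd_env`.
      cmd_env = { LLM_LOG_LEVEL = "info" },
    },
  },
}
```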
Only map normal or insert mode if configured
Extremely slow on my Mac with an M2 chip, even using a very light starcoder2 3b model. I can see very high GPU or CPU usage and Idle Wake Ups....
As you can see, I'm getting this weird warning and no completion when trying to write C++. This is my llm.nvim setup using lazy:

```lua
{
  'huggingface/llm.nvim',
  opts = {...
```
I am trying to run Starcoder locally through Ollama, and I want to get code auto-completion like in the README gif. But I keep getting the following error after every...
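For context, a minimal sketch of an Ollama-backed setup (option names assumed from the plugin's README conventions; the model tag and endpoint path may need adjusting for your install and plugin version):

```lua
-- Assumed llm.nvim setup for a local Ollama backend; verify option
-- names against the README for your installed version.
require("llm").setup({
  backend = "ollama",                -- assumption: backend identifier
  model = "starcoder2:3b",           -- example model tag
  url = "http://localhost:11434",    -- default Ollama endpoint; some
                                     -- versions expect /api/generate appended
})
```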