llm-ls
LSP server leveraging LLMs for code completion (and more?)
- add support for https://github.com/ggerganov/llama.cpp (located at `examples/server`)
- fix inverted condition that was causing all ranges to be invalid
https://github.com/huggingface/llm-ls/blob/f58085b8127dc09ae09645e04019a668a85e4976/crates/llm-ls/src/main.rs#L795 On my Windows 11 machine, `Instant::now()` may return a value smaller than `MAX_WARNING_REPEAT` (3600s), which causes `checked_sub` to return `None` and makes the program show the error message "instant to be...
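A minimal sketch of the failure mode and one guarded alternative; the constant name mirrors the report, and this is illustrative rather than the repo's actual code:

```rust
use std::time::{Duration, Instant};

const MAX_WARNING_REPEAT: Duration = Duration::from_secs(3600);

fn main() {
    let now = Instant::now();

    // On some platforms (e.g. Windows, where `Instant` is backed by a counter
    // that starts near boot), `now` can represent less than one hour of time,
    // so subtracting MAX_WARNING_REPEAT underflows and `checked_sub` yields `None`.
    match now.checked_sub(MAX_WARNING_REPEAT) {
        Some(threshold) => println!("threshold instant was {:?} ago", threshold.elapsed()),
        // Falling back instead of unwrapping avoids the "instant to be..." panic message.
        None => println!("uptime below MAX_WARNING_REPEAT, falling back to `now`"),
    }
}
```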
Hello, I'm seeing completions that do not display, and I've managed to track this down to the `should_complete` function. Here's what happens:
```python
def test():
    if {cursor_position}
```
In this case,...
I'm using Neovim with llm.nvim and I'm getting this error when calling the `LLMSuggestion` command:
```
[LLM] missing field `request_params`
```
llm.nvim config:
```
require('llm').setup({
  backend = "ollama",
  model = "llama3:text",
  url...
```
I have the plugin installed and configured. I see ghost text followed by "^M", but I cannot figure out how to accept the suggestion....
https://github.com/huggingface/llm-ls/blob/2a433cdf75dc0a225e95753256f2601161bc6747/crates/testbed/src/main.rs#L346C24-L346C24 The linked statement results in the following error:
```text
error[E0425]: cannot find function `symlink` in module `fs`
   --> crates\testbed\src\main.rs:346:21
    |
346 |                 fs::symlink(link_target, dst_path.clone()).await?;
    |                     ^^^^^^^ not found in...
```
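The error appears on Windows because `tokio::fs::symlink` is Unix-only, while Windows exposes `symlink_file`/`symlink_dir` instead. A sketch of a platform-gated workaround (assuming tokio with the `fs` feature; not necessarily how the repo ends up fixing it):

```rust
use std::path::Path;

/// On Unix, a single `symlink` call handles both files and directories.
#[cfg(unix)]
async fn make_symlink(src: &Path, dst: &Path) -> std::io::Result<()> {
    tokio::fs::symlink(src, dst).await
}

/// On Windows, files and directories need different symlink calls.
#[cfg(windows)]
async fn make_symlink(src: &Path, dst: &Path) -> std::io::Result<()> {
    if tokio::fs::metadata(src).await?.is_dir() {
        tokio::fs::symlink_dir(src, dst).await
    } else {
        tokio::fs::symlink_file(src, dst).await
    }
}
```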
When trying to use DeepSeek Coder (via Ollama) with its tokenizer and FIM tokens, the result seems completely irrelevant (or, maybe, cut off). However, when using the prompt I...
Proof of concept: https://github.com/blmarket/llm-ls Hi, I'm wondering whether llm-ls could incorporate a dedicated LLM server provider within the LSP server, preferably as a shared instance via [daemonize](https://crates.io/crates/daemonize). The idea was inspired by the [Bazel client/server model](https://bazel.build/run/client-server). It works...
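For context, a minimal sketch of spawning such a shared background instance with the `daemonize` crate; the paths and the `run_server` function are hypothetical placeholders, not part of llm-ls:

```rust
use daemonize::Daemonize;

fn main() {
    // Hypothetical paths; a real integration would likely derive these
    // from the user's cache/runtime directories.
    let daemon = Daemonize::new()
        .pid_file("/tmp/llm-ls-daemon.pid")
        .working_directory("/tmp");

    match daemon.start() {
        // In the detached child process: run the shared LLM server.
        Ok(_) => run_server(),
        Err(e) => eprintln!("failed to daemonize: {e}"),
    }
}

fn run_server() {
    // Placeholder for the shared LLM server loop.
}
```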
Handle the fact that there may be multiple initialized workspaces requiring completions. Will add the initialization of new workspaces in a follow-up PR
After installing the LS, it seems to ignore XDG environment variables, specifically `${XDG_CACHE_HOME}`. It has the `~/.cache` directory hard-coded. I suggest adding functionality that would check for...
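A minimal sketch of the kind of lookup being suggested, assuming the `home` crate for the fallback; the `llm_ls` subdirectory name is illustrative:

```rust
use std::env;
use std::path::PathBuf;

/// Resolve the cache directory, preferring $XDG_CACHE_HOME and
/// falling back to ~/.cache when the variable is unset or empty.
fn cache_dir() -> Option<PathBuf> {
    let base = match env::var_os("XDG_CACHE_HOME") {
        Some(val) if !val.is_empty() => PathBuf::from(val),
        _ => home::home_dir()?.join(".cache"),
    };
    // "llm_ls" here is a placeholder; the real subdirectory name may differ.
    Some(base.join("llm_ls"))
}
```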