
Default model (starcoder2-15b) error with `llm-ls`

Open Gilk260 opened this issue 9 months ago • 0 comments

I'm trying to use the default configuration and run the model from Hugging Face, but the default model seems to be unreachable.

Here is my environment: NVIM v0.10.4, Ubuntu 24.04.1 LTS.

init.lua:

```lua
return {
  {
    'huggingface/llm.nvim',
    ft = { "python" },
    opts = require("configs.llm")
  },
}
```

configs/llm.lua:

```lua
return {
  tokens_to_clear = { "<|endoftext|>" },
  fim = {
    enabled = true,
    prefix = "<fim_prefix>",
    middle = "<fim_middle>",
    suffix = "<fim_suffix>",
  },
  model = "bigcode/starcoder2-15b",
  context_window = 8192,
  tokenizer = nil, -- also tried the repository way
  lsp = {
    bin_path = vim.api.nvim_call_function("stdpath", { "data" }) .. "/mason/bin/llm-ls",
    cmd_env = { LLM_LOG_LEVEL = "DEBUG" }
  },
}
```

As you can see, this is the default config for running StarCoder2-15b from HF. I have installed llm-ls from Mason and set the environment variable LLM_NVIM_HF_API_TOKEN to a Write token (I know I can restrict its permissions).
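To check whether the backend is returning something other than JSON, it can help to hit the endpoint directly and look at the raw body. This is a minimal sketch, not llm-ls's actual client: the URL follows the public Hugging Face Inference API convention (`api-inference.huggingface.co/models/<model>`), and the request payload is an assumption for illustration.

```python
import os
import urllib.request

# Assumed endpoint (Hugging Face Inference API convention); llm-ls's actual
# request shape may differ.
MODEL = "bigcode/starcoder2-15b"
URL = f"https://api-inference.huggingface.co/models/{MODEL}"

def build_request(token: str) -> urllib.request.Request:
    """Build a POST request carrying the same bearer token llm.nvim reads
    from LLM_NVIM_HF_API_TOKEN."""
    return urllib.request.Request(
        URL,
        data=b'{"inputs": "def fib(n):"}',
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To run the check (requires a valid token in LLM_NVIM_HF_API_TOKEN):
#   req = build_request(os.environ["LLM_NVIM_HF_API_TOKEN"])
#   body = urllib.request.urlopen(req).read().decode()
#   print(body)  # an HTML or plain-text body here would explain the
#                # "expected value" error in the log below
```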

Here is the error message in the log file:

```json
{"timestamp":"2025-06-06T14:56:16.914325Z","level":"ERROR","err_msg":"serde json error: expected value at line 1 column 1","target":"llm_ls::error","line_number":8,"spans":[{"request_id":"e6c485d2-afaa-484a-a068-49b8ff11a4e0","name":"completion_request"}]}
```
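For context, serde_json's "expected value at line 1 column 1" fires when the very first byte of the input is not the start of any JSON value, i.e. the completion backend answered with something other than JSON (an HTML error page, a plain-text message, or an empty body). Python's parser reports the same position on such input, which this sketch demonstrates:

```python
import json

# Non-JSON bodies all fail at line 1, column 1 -- the same class of error
# serde_json logs in llm-ls when the API response is not JSON.
for body in ("Not Found", "<html>error</html>", ""):
    try:
        json.loads(body)
    except json.JSONDecodeError as e:
        print(f"{body!r}: {e.msg} at line {e.lineno} column {e.colno}")
```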

Gilk260 · Feb 03 '25 11:02