
bug:

thewinger opened this issue 7 months ago

Describe the bug

I have configured an Ollama instance on my local network as the provider, with the devstral:latest model. Whenever I try to use Avante it shows the following error:

Error: API request failed with status 400. Body: '{"error":"json: cannot unmarshal string into Go struct field ChatRequest.messages.tool_calls.function.arguments of type api.ToolCallFunctionArguments"}'

I am not sure what I need to configure to make it work.
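For context, the error text suggests that Ollama's /api/chat endpoint expects tool_calls[].function.arguments to be a JSON object, while the request it received carried a string there. Below is a rough sketch of the two shapes written as Lua tables, purely for illustration; only the field path comes from the error message, and the tool name and arguments are made up.

-- Illustrative only: shapes implied by the unmarshal error, not avante's actual payload.
-- What Ollama's Go struct (api.ToolCallFunctionArguments) can decode: arguments as an object.
local accepted = {
  tool_calls = {
    {
      ["function"] = {
        name = "read_file",                -- hypothetical tool name
        arguments = { path = "init.lua" }, -- becomes a JSON object after encoding
      },
    },
  },
}

-- What produces the 400: the same field serialized as a string.
local rejected = {
  tool_calls = {
    {
      ["function"] = {
        name = "read_file",
        arguments = '{"path":"init.lua"}', -- a JSON string; Go cannot unmarshal this into the struct
      },
    },
  },
}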

To reproduce

No response

Expected behavior

No response

Installation method

Use lazy.nvim:

return {
  {
    "yetone/avante.nvim",
    event = "VeryLazy",
    lazy = false,
    version = false, -- set this if you want to always pull the latest change

    -- if you want to build from source then do `make BUILD_FROM_SOURCE=true`
    build = "make",
    -- build = "powershell -ExecutionPolicy Bypass -File Build.ps1 -BuildFromSource false" -- for windows
    dependencies = {
      "nvim-treesitter/nvim-treesitter",
      "stevearc/dressing.nvim",
      "nvim-lua/plenary.nvim",
      "MunifTanjim/nui.nvim",
      --- The below dependencies are optional,
      "saghen/blink.cmp",
      "echasnovski/mini.icons",
      "folke/snacks.nvim",
      {
        -- support for image pasting
        "HakonHarnes/img-clip.nvim",
        event = "VeryLazy",
        opts = {
          -- recommended settings
          default = {
            embed_image_as_base64 = false,
            prompt_for_file_name = false,
            drag_and_drop = {
              insert_mode = true,
            },
            -- required for Windows users
            use_absolute_path = true,
          },
        },
      },
      {
        -- Make sure to set this up properly if you have lazy=true
        "MeanderingProgrammer/render-markdown.nvim",
        dependencies = { "nvim-treesitter/nvim-treesitter", "echasnovski/mini.nvim" }, -- if you use the mini.nvim suite
        ---@module 'render-markdown'
        ---@type render.md.UserConfig
        opts = {
          completions = {
            lsp = { enabled = true },
            blink = { enabled = true },
          },
          file_types = { "markdown", "Avante" },
          -- log_level = "debug",
          -- overrides = {
          --   buftype = {
          --     nofile = {
          --       render_modes = { "n", "c", "i" },
          --       debounce = 5,
          --     },
          --   },
          -- filetype = {},
          -- },
        },
        ft = { "markdown", "Avante" },
      },
    },
    config = function()
      require("avante").setup({
        provider = "ollama",
        claude = {
          endpoint = "https://api.anthropic.com",
          -- model = "claude-3-7-sonnet-20250219",
          model = "claude-3-5-sonnet-20241022",
          temperature = 0,
          timeout = 30000,
          max_tokens = 4096,
          -- disable_tools = true,
        },
        ollama = {
          endpoint = "http://192.168.1.70:11434",
          model = "devstral:latest",
          -- model = "gemma3:27b",
          options = {
            temperature = 0,
            num_ctx = 32768,
          },
        },
        behaviour = {
          enable_cursor_planning_mode = true, -- enable cursor planning mode!
        },
        file_selector = {
          ---@alias FileSelectorProvider "native" | "fzf" | "mini.pick" | "snacks" | "telescope" | string | fun(params: avante.file_selector.IParams|nil): nil
          provider = "snacks",
          -- Options override for custom providers
          -- provider_opts = {},
        },
        -- The system_prompt type supports both a string and a function that returns a string. Using a function here allows dynamically updating the prompt with mcphub
        system_prompt = function()
          local hub = require("mcphub").get_hub_instance()
          return hub:get_active_servers_prompt()
        end,
        -- The custom_tools type supports both a list and a function that returns a list. Using a function here prevents requiring mcphub before it's loaded
        custom_tools = function()
          return {
            require("mcphub.extensions.avante").mcp_tool(),
          }
        end,
        -- disabled_tools = {
        --   "list_files",
        --   "search_files",
        --   "read_file",
        --   "create_file",
        --   "rename_file",
        --   "delete_file",
        --   "create_dir",
        --   "rename_dir",
        --   "delete_dir",
        --   "bash",
        -- },
      })
    end,
  },
  {
    "ravitemer/mcphub.nvim",
    dependencies = {
      "nvim-lua/plenary.nvim", -- Required for Job and HTTP requests
    },
    -- cmd = "MCPHub", -- lazily start the hub when `MCPHub` is called
    build = "npm install -g mcp-hub@latest", -- Installs required mcp-hub npm module
    config = function()
      require("mcphub").setup({
        -- Required options
        port = 3050, -- Port for MCP Hub server
        config = vim.fn.expand("/Users/win/.config/nvim/mcpservers.json"), -- Absolute path to config file

        -- Optional options
        on_ready = function(hub)
          -- Called when hub is ready
        end,
        on_error = function(err)
          -- Called on errors
        end,
        log = {
          level = vim.log.levels.DEBUG, -- More verbose logging
          to_file = true, -- Enable file logging
          file_path = "/Users/win/logs/mcphub.log", -- Custom log path
          prefix = "MCPHub", -- Log prefix
        },
        auto_approve = true,
        extensions = {
          avante = {
            make_slash_commands = true, -- make /slash commands from MCP server prompts
          },
        },
      })
    end,
  },
}

Environment

Neovim version: NVIM v0.11.1 Build type: Release LuaJIT 2.1.1744318430 Run "nvim -V1 -v" for more info

Platform: Darwin winMBP.local 24.5.0 Darwin Kernel Version 24.5.0: Tue Apr 22 19:54:49 PDT 2025; root:xnu-11417.121.6~2/RELEASE_ARM64_T6000 arm64

Repro


thewinger avatar May 27 '25 07:05 thewinger

Got this working myself by swapping endpoint = "http://192.168.1.70:11434" for endpoint = "http://localhost:11434". It's pretty slow, but still impressive what can be done purely on your own machine.

mrf-hayden avatar May 28 '25 22:05 mrf-hayden

@mrf-hayden I can't do that. My endpoint is not on the same computer as avante.nvim; it is on another device on my LAN.

thewinger avatar May 29 '25 06:05 thewinger

~~Is your Ollama listening on 0.0.0.0? I had trouble permanently changing it from localhost when installed with Homebrew. It works for me now.~~

EDIT: I've just realized that you are getting a 400 back, not a 5xx. Sorry mate, I don't have any other ideas.

burmajam avatar May 29 '25 09:05 burmajam

I should add that "works" is relative here. I'm getting decent, if slow, responses back from the model when asking questions about the code, but it seems to be struggling to use tools correctly: it will describe what tools it has access to but then fail to make those tool calls, or hallucinate that it made a tool call when it didn't.

mrf-hayden avatar May 29 '25 15:05 mrf-hayden

Seems my issue might be a known issue with Ollama: https://github.com/ollama/ollama/issues/9632#issuecomment-2849475887. Guess my dream of fully open-source AI-assisted coding will have to wait a little longer.
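One workaround sketch, assuming the ollama provider block accepts the same disable_tools flag that the claude block in the config above has commented out, would be to turn tools off for Ollama so no tool_calls are sent at all:

require("avante").setup({
  provider = "ollama",
  ollama = {
    endpoint = "http://192.168.1.70:11434",
    model = "devstral:latest",
    disable_tools = true, -- assumption: same flag as in the claude example; skips tool_calls entirely
    options = {
      temperature = 0,
      num_ctx = 32768,
    },
  },
})

That gives up tool use entirely, so it only sidesteps the unmarshal error rather than fixing the underlying Ollama issue.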

mrf-hayden avatar May 29 '25 16:05 mrf-hayden

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Jun 29 '25 02:06 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Jul 05 '25 02:07 github-actions[bot]