codecompanion.nvim
[Bug]: OpenAI-compatible adapter doesn't work with some API providers due to not handling empty content fields
Your minimal.lua
---@diagnostic disable: missing-fields
--NOTE: Set config path to enable the copilot adapter to work.
--It will search the following paths for the copilot token:
-- - "$CODECOMPANION_TOKEN_PATH/github-copilot/hosts.json"
-- - "$CODECOMPANION_TOKEN_PATH/github-copilot/apps.json"
vim.env["CODECOMPANION_TOKEN_PATH"] = vim.fn.expand("~/.config")
vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()
-- Your CodeCompanion setup
local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-treesitter/nvim-treesitter", build = ":TSUpdate" },
      { "nvim-lua/plenary.nvim" },
      { "hrsh7th/nvim-cmp" },
      { "stevearc/dressing.nvim", opts = {} },
      { "nvim-telescope/telescope.nvim" },
    },
    opts = {
      --Refer to: https://github.com/olimorris/codecompanion.nvim/blob/main/lua/codecompanion/config.lua
      strategies = {
        --NOTE: Change the adapter as required
        chat = { adapter = "openai_compatible" },
        inline = { adapter = "copilot" },
      },
      opts = {
        log_level = "DEBUG",
      },
      adapters = {
        openai_compatible = function()
          return require("codecompanion.adapters").extend("openai_compatible", {
            env = {
              url = "https://glhf.chat",
              api_key = "GLHF_API_KEY",
              chat_url = "/api/openai/v1/chat/completions",
            },
            schema = {
              model = {
                default = "hf:Qwen/Qwen2.5-Coder-32B-Instruct",
              },
              num_ctx = {
                default = 32768,
              },
            },
          })
        end,
      },
    },
  },
}
require("lazy.minit").repro({ spec = plugins })
-- Setup Tree-sitter
local ts_status, treesitter = pcall(require, "nvim-treesitter.configs")
if ts_status then
  treesitter.setup({
    ensure_installed = { "lua", "markdown", "markdown_inline", "yaml" },
    highlight = { enable = true },
  })
end
-- Setup completion
local cmp_status, cmp = pcall(require, "cmp")
if cmp_status then
  cmp.setup({
    mapping = cmp.mapping.preset.insert({
      ["<C-b>"] = cmp.mapping.scroll_docs(-4),
      ["<C-f>"] = cmp.mapping.scroll_docs(4),
      ["<C-Space>"] = cmp.mapping.complete(),
      ["<C-e>"] = cmp.mapping.abort(),
      -- Accept the currently selected item. Set `select` to `false` to only confirm explicitly selected items.
      ["<CR>"] = cmp.mapping.confirm({ select = true }),
    }),
  })
end
Error messages
No error messages, just a borked chat history where user and assistant messages aren't correctly separated.
Log output
Seems empty?
Health check output
Describe the bug
What I expect to happen:
The user and assistant messages are properly separated and the headers appear
What actually happens:
The chat buffer runs the user and assistant messages together without the separating headers.

This happens because some OpenAI-compatible API providers return streaming deltas with an empty or missing content field. CodeCompanion already handles a nil role field in deltas, but not a nil content field. I have a fork with a fix and am happy to open a PR; it's a small change. I'm opening this issue first, as per the CONTRIBUTING.md.
Reproduce the bug
- Get an API key from Fireworks or GLHF.chat
- Use it for the OpenAI-compatible API provider
- CodeCompanion's chat buffer will break: user and assistant messages are no longer separated by headers
Final checks
- [X] I have made sure this issue exists in the latest version of the plugin
- [X] I have tested with the minimal.lua file from above and have shared this
- [X] I have shared the contents of the log file