bug: Cannot use o1 from GitHub Copilot.
Describe the bug
Cannot use o1 from GitHub Copilot. I think that CopilotChat.nvim, which probably uses api.githubcopilot.com, can use o1, so it may also be possible to use it in avante.nvim. Could you please confirm?
To reproduce
Setting:
```lua
require("avante").setup({
  provider = "copilot",
  copilot = {
    endpoint = "https://api.githubcopilot.com/",
    model = "o1",
    -- claude can be used with avante
    -- model = "claude-3.5-sonnet",
    proxy = nil, -- [protocol://]host[:port] Use this proxy
    allow_insecure = false, -- Allow insecure server connections
    timeout = 30000, -- Timeout in milliseconds
    temperature = 0,
    max_tokens = 8192,
  },
  -- auto_suggestions_provider = "copilot",
  behaviour = {
    auto_suggestions = false, -- Experimental stage
    auto_set_highlight_group = true,
    auto_set_keymaps = false,
    auto_apply_diff_after_generation = false,
    support_paste_from_clipboard = true,
  },
})
```

Steps:
- execute AvanteAsk
- input text
- execute avante
Expected behavior
avante works with the o1 model.
Installation method
Use lazy.nvim:
```lua
{
  "yetone/avante.nvim",
  event = "VeryLazy",
  lazy = false,
  version = false, -- set this if you want to always pull the latest change
  opts = {
    -- add any opts here
  },
  -- if you want to build from source then do `make BUILD_FROM_SOURCE=true`
  build = "make",
  -- build = "powershell -ExecutionPolicy Bypass -File Build.ps1 -BuildFromSource false" -- for windows
  dependencies = {
    "nvim-treesitter/nvim-treesitter",
    "stevearc/dressing.nvim",
    "nvim-lua/plenary.nvim",
    "MunifTanjim/nui.nvim",
  },
}
```
Environment
- nvim: 0.10.2
- OS: macOS
- avante: latest
- make: executed
Repro
```lua
vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()
require("lazy.minit").repro({
  spec = {
    -- add any other plugins here
  },
})
require("avante").setup({
  provider = "copilot",
  copilot = {
    endpoint = "https://api.githubcopilot.com",
    model = "o1",
    -- claude can be used with avante
    -- model = "claude-3.5-sonnet",
    proxy = nil, -- [protocol://]host[:port] Use this proxy
    allow_insecure = false, -- Allow insecure server connections
    timeout = 30000, -- Timeout in milliseconds
    temperature = 0,
    max_tokens = 8192,
  },
  -- auto_suggestions_provider = "copilot",
  behaviour = {
    auto_suggestions = false, -- Experimental stage
    auto_set_highlight_group = true,
    auto_set_keymaps = false,
    auto_apply_diff_after_generation = false,
    support_paste_from_clipboard = true,
  },
})
```
Any updates? o1 is stuck at "Generating response...". Similar environment setup.
I was using claude-3.5-sonnet, which used to work, and now I am getting "unsupported model" errors. MS may have limited custom model support to enterprise users or changed how the models are referenced in the API. I had to comment out the model parameter for now.
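For reference, the workaround described above (commenting out the model parameter) would look roughly like the fragment below. Note this is a sketch based on the config posted earlier in this thread; that avante falls back to a working default model for the copilot provider when `model` is unset is an assumption drawn from the comment above, not confirmed behavior.

```lua
require("avante").setup({
  provider = "copilot",
  copilot = {
    endpoint = "https://api.githubcopilot.com",
    -- model = "claude-3.5-sonnet", -- commented out due to "unsupported model" errors
    -- with no model set, the provider's default model should be used instead
    timeout = 30000,
    temperature = 0,
    max_tokens = 8192,
  },
})
```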
You might need to check the Copilot chat interface in VSCode to see why this error is reported.
What are the valid model names we can use with the copilot provider? Are they documented anywhere?
I swear I remember this being in the wiki previously (using model marketplace providers via copilot), but I can't find it now. Would love to find some docs for it.
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.