copilot.el

Add option to switch completion model

Open tian-yi opened this issue 11 months ago • 11 comments

GitHub announced a new code completion model based on GPT-4o mini, which would be great to support in copilot.el. Thanks.

https://github.blog/changelog/2025-02-18-new-gpt-4o-copilot-code-completion-model-now-available-in-public-preview-for-copilot-in-vs-code/

https://docs.github.com/en/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-code-completion

tian-yi avatar Mar 14 '25 21:03 tian-yi

I agree that would be a nice improvement. I hope I'll get to working on this eventually, but in the meantime, PRs welcome!

bbatsov avatar Apr 10 '25 11:04 bbatsov

Can't confirm yet, but you can give it a try by adding

{
  editorConfiguration: {
    github: {
      copilot: {
        selectedCompletionModel: "gpt-4o-mini"
      }
    }
  }
}

to the setEditorInfo parameters here: https://github.com/copilot-emacs/copilot.el/blob/f0e4ce61ba7565f9eb9393e88bd88868872a7d4f/copilot.el#L480-L484

Sending

(copilot--request 'copilot/models '())

should return the list of available models.

EDITED: selectedCompletionModel should be gpt-4o-copilot, not gpt-4o-mini as written above.

See more available model ids in https://github.com/copilot-emacs/copilot.el/issues/382#issuecomment-2803851015
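In Elisp plist form (as copilot.el serializes plists to JSON for the server), the addition might look like the fragment below. How copilot.el builds the full setEditorInfo parameter plist at that commit is an assumption; this is only the piece to merge in:

```elisp
;; Sketch only: the plist fragment to merge into the parameters
;; copilot.el sends with the setEditorInfo request. The keys mirror
;; the JSON above; the surrounding parameter shape is an assumption.
(:editorConfiguration
 (:github (:copilot (:selectedCompletionModel "gpt-4o-copilot"))))
```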

knilink avatar Apr 12 '25 16:04 knilink

Based on these sources, it seems they're using gpt-4o-copilot. I wonder which is correct, gpt-4o-copilot or gpt-4o-mini? They might be the same thing.

Either way, this should probably be made the default behavior.

ncaq avatar Apr 14 '25 03:04 ncaq

It seems that gpt-4o-copilot is the likely option, though both may work. A quick search suggests that gpt-4o-mini is intended for Copilot Chat. Here is the relevant excerpt from the bundled (minified) language-server JavaScript:

    capabilities: {
      family: "gpt-3.5-turbo",
      object: "model_capabilities",
      supports: { streaming: !0 },
      tokenizer: "cl100k_base",
      type: "completion",
    },
    id: "copilot-codex",
    model_picker_enabled: !0,
    name: "GPT-3.5 Turbo",
    object: "model",
    preview: !1,
    version: "copilot-codex",
  },
  Uit = "gpt-4o-copilot",

ncaq avatar Apr 15 '25 04:04 ncaq

I see, so the id is "gpt-4o-copilot" and the modelFamily is "gpt-4o-mini".
And because it's listed with :scopes ["completion"], it gets picked up by the LSP server.
I spent a while digging through the bundled JavaScript, but I couldn't figure out how to specify this in the initializationOptions of the initialize method.
I have a feeling that if you enable the feature preview in the GitHub settings, it gets selected by default, but I'm not sure that's really the case.

ELISP> (copilot--request 'copilot/models dummy-request)
[(:modelFamily "gpt-4o" :modelName "GPT-4o" :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "gpt-4o" :preview :json-false :capabilities
               (:supports
                 (:vision t)))
 (:modelFamily "o1-ga" :modelName "o1 (Preview)" :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "o1" :preview t :capabilities
               (:supports
                 (:vision :json-false)))
 (:modelFamily "o3-mini" :modelName "o3-mini" :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "o3-mini" :preview :json-false :capabilities
               (:supports
                 (:vision :json-false)))
 (:modelFamily "claude-3.5-sonnet" :modelName "Claude 3.5 Sonnet" :modelPolicy
               (:state "enabled" :terms "Enable access to the latest Claude 3.5 Sonnet model from Anthropic. [Learn more about how GitHub Copilot serves Claude 3.5 Sonnet](https://docs.github.com/copilot/using-github-copilot/using-claude-sonnet-in-github-copilot).")
               :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "claude-3.5-sonnet" :preview :json-false :capabilities
               (:supports
                 (:vision t)))
 (:modelFamily "claude-3.7-sonnet" :modelName "Claude 3.7 Sonnet" :modelPolicy
               (:state "enabled" :terms "Enable access to the latest Claude 3.7 Sonnet model from Anthropic. [Learn more about how GitHub Copilot serves Claude 3.7 Sonnet](https://docs.github.com/copilot/using-github-copilot/using-claude-sonnet-in-github-copilot).")
               :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "claude-3.7-sonnet" :preview :json-false :capabilities
               (:supports
                 (:vision t)))
 (:modelFamily "claude-3.7-sonnet-thought" :modelName "Claude 3.7 Sonnet Thinking" :modelPolicy
               (:state "enabled" :terms "Enable access to the latest Claude 3.7 Sonnet model from Anthropic. [Learn more about how GitHub Copilot serves Claude 3.7 Sonnet](https://docs.github.com/copilot/using-github-copilot/using-claude-sonnet-in-github-copilot).")
               :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "claude-3.7-sonnet-thought" :preview :json-false :capabilities
               (:supports
                 (:vision t)))
 (:modelFamily "gemini-2.0-flash" :modelName "Gemini 2.0 Flash" :modelPolicy
               (:state "enabled" :terms "Enable access to the latest Gemini models from Google. [Learn more about how GitHub Copilot serves Gemini 2.0 Flash](https://docs.github.com/en/copilot/using-github-copilot/ai-models/using-gemini-flash-in-github-copilot).")
               :scopes
               ["chat-panel" "edit-panel" "inline"]
               :id "gemini-2.0-flash-001" :preview :json-false :capabilities
               (:supports
                 (:vision t)))
 (:modelFamily "gpt-4o-mini" :modelName "GPT-4o Copilot" :scopes
               ["completion"]
               :id "gpt-4o-copilot" :preview :json-false :capabilities
               (:supports
                 (:vision :json-false)))]

ncaq avatar Apr 15 '25 05:04 ncaq

@ncaq yeah, the id gpt-4o-mini I provided above was purely a guess.

The selectedCompletionModel setting needs to be a model id from the list you just posted, which means it's supposed to be gpt-4o-copilot.

If it's incorrect, the log buffer *copilot-language-server-log* should print an error like User selected model wrong-model is not in the list of generic models: model-1, model-2, ..., falling back to default model. That is also a way to discover the available models.
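In other words, you can enumerate the valid ids by selecting a bogus model on purpose. This sketch uses the copilot-lsp-settings variable mentioned later in this thread; whether your copilot.el version exposes it is an assumption:

```elisp
;; Deliberately pick a model id that does not exist, then open the
;; *copilot-language-server-log* buffer: the server's error line
;; "User selected model ... is not in the list of generic models: ..."
;; enumerates the valid completion-model ids.
(setq copilot-lsp-settings
      '(:github (:copilot (:selectedCompletionModel "wrong-model"))))
```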

knilink avatar Apr 15 '25 05:04 knilink

When examining the output in the *copilot-language-server-log* buffer, I noticed the following:

[lsp] GitHub Copilot Language Server 1.302.0 initialized
[certificates] Removed 1 expired certificates
[fetchCompletions] request.response: [https://proxy.business.githubcopilot.com/v1/engines/gpt-4o-copilot/completions] took 287.7631450000008 ms

This leads me to believe that GPT-4o may indeed now be running by default.

This aligns with my earlier thoughts when I was tracing the official language server's behavior. Although we don't have the option to select it, I suspect the GPT-3.5-based (copilot-codex) completions will be discontinued soon anyway, so it probably isn't necessary to support something that's about to be deprecated.
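To check which engine endpoint your own session is hitting, the log can be inspected with plain Emacs commands (the buffer name is taken from this thread):

```elisp
;; Open the language-server log and look for fetchCompletions lines,
;; e.g. ".../v1/engines/gpt-4o-copilot/completions".
(pop-to-buffer "*copilot-language-server-log*")
```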

ncaq avatar Apr 21 '25 02:04 ncaq

I attempted setting a different model with:

(setq copilot-lsp-settings '(:github (:copilot (:selectedCompletionModel "claude-3.7-sonnet"))))

but in *copilot-language-server-log* I got:

[default] User selected model claude-3.7-sonnet is not in the list of generic models: gpt-4o-copilot, falling back to default model.

I think we can choose the editor completion model only from models that include :scopes ["completion"]; today that is only gpt-4o-copilot.

mspik avatar Apr 23 '25 10:04 mspik

I wish to use claude-3.7-sonnet in Emacs.

getong avatar Apr 28 '25 03:04 getong

Just to leave an updated set of commands as of today:

(copilot--request 'copilot/models '(:dummy "dummy")) 
(setq copilot-lsp-settings '(:github (:copilot (:selectedCompletionModel "gpt-41-copilot"))))

indigoviolet avatar Oct 08 '25 18:10 indigoviolet

I wish to use these in Emacs:

  • claude-3.7-sonnet

  • claude-4.5-sonnet

  • gemini-2.5-pro

mustafaabobakr avatar Nov 16 '25 11:11 mustafaabobakr