Feature: Add GitHub Copilot as model provider
Issue
Hello!
Please add GitHub Copilot as a model provider.
Should be possible like this: https://github.com/olimorris/codecompanion.nvim/blob/5c5a5c759b8c925e81f8584a0279eefc8a6c6643/lua/codecompanion/adapters/copilot.lua
Idea taken from: https://github.com/cline/cline/discussions/660
Thank you!
Version and model info
No response
Seems like it's a standard OpenAI-style API, with a token that is refreshed periodically using a long-lived refresh token, OAuth style.
The tricky part seems to be actually getting the refresh token in the first place. All the Neovim plugins rely on authenticating through copilot.vim or copilot.lua, which use GitHub's closed-source JavaScript language server. Then, once the initial auth step is done, they copy the refresh token from the location in the filesystem where it gets saved and go from there.
The good news is that the refresh token doesn't really seem to ever expire. My token at ~/.config/github-copilot/hosts.json is two years old and still works even though my Copilot subscription lapsed for a time in between.
If we're ok with having Aider pull its refresh token in a similar way, then this should be pretty easy to implement, but the initial setup for the user may be a bit of a hassle.
For additional reference, here's Avante.nvim's implementation which is pretty similar to the one in CodeCompanion.
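For illustration, here's roughly what the second half of that looks like in Python: read the saved refresh token and exchange it for the short-lived API token, the way the Neovim plugins do. The endpoint, headers, file layout, and response fields below are simply what those plugins appear to use; none of this is officially documented, so treat all of it as an assumption.

```python
import json
from pathlib import Path

import requests

# Read the long-lived OAuth token that copilot.vim / copilot.lua saved after the
# initial sign-in. The file layout is unofficial; mine looks roughly like
# {"github.com": {"oauth_token": "...", "user": "..."}}.
hosts_path = Path('~/.config/github-copilot/hosts.json').expanduser()
hosts = json.loads(hosts_path.read_text())
oauth_token = next(v['oauth_token'] for k, v in hosts.items() if 'github.com' in k)

# Exchange it for a short-lived Copilot API token, the same way the plugins do.
# This endpoint is internal/undocumented, so it may change without notice.
resp = requests.get(
    'https://api.github.com/copilot_internal/v2/token',
    headers={
        'authorization': f'token {oauth_token}',
        'editor-version': 'Neovim/0.6.1',
        'editor-plugin-version': 'copilot.vim/1.16.0',
        'user-agent': 'GithubCopilot/1.155.0',
    },
)
resp.raise_for_status()
token_info = resp.json()
copilot_token = token_info['token']       # short-lived; used as the Bearer token
expires_at = token_info['expires_at']     # unix timestamp; refresh before this
```

The short-lived token is then used as a Bearer token against the chat endpoint and refreshed before it expires.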
Great, that doesn't look too complicated, thanks.
Haven't looked into it, is the JS language server the only way to get the token?
Anything we can do with these: https://github.com/settings/tokens?
> Haven't looked into it, is the JS language server the only way to get the token?
This code can get the token from the GitHub API. Note that it registers itself as copilot.vim, as I don't know whether Copilot allows other third-party clients.
```python
import time

import requests

# Start GitHub's OAuth device flow, registering as copilot.vim since it is
# unclear whether Copilot accepts other third-party client IDs.
resp = requests.post('https://github.com/login/device/code', headers={
    'accept': 'application/json',
    'editor-version': 'Neovim/0.6.1',
    'editor-plugin-version': 'copilot.vim/1.16.0',
    'content-type': 'application/json',
    'user-agent': 'GithubCopilot/1.155.0',
    'accept-encoding': 'gzip,deflate,br'
}, data='{"client_id":"Iv1.b507a08c87ecfe98","scope":"read:user"}')

# Parse the response json, isolating the device_code, user_code, and verification_uri
resp_json = resp.json()
device_code = resp_json.get('device_code')
user_code = resp_json.get('user_code')
verification_uri = resp_json.get('verification_uri')

# Print the user code and verification uri
print(f'Please visit {verification_uri} and enter code {user_code} to authenticate.')

# Poll until the user has completed the verification step in the browser.
while True:
    time.sleep(5)
    resp = requests.post('https://github.com/login/oauth/access_token', headers={
        'accept': 'application/json',
        'editor-version': 'Neovim/0.6.1',
        'editor-plugin-version': 'copilot.vim/1.16.0',
        'content-type': 'application/json',
        'user-agent': 'GithubCopilot/1.155.0',
        'accept-encoding': 'gzip,deflate,br'
    }, data=f'{{"client_id":"Iv1.b507a08c87ecfe98","device_code":"{device_code}","grant_type":"urn:ietf:params:oauth:grant-type:device_code"}}')

    # Parse the response json, isolating the access_token
    resp_json = resp.json()
    access_token = resp_json.get('access_token')
    if access_token:
        break

# This is the long-lived OAuth token (the same value the Neovim plugins keep in
# hosts.json); it is what later gets exchanged for the short-lived API token.
print('Authentication success: ' + access_token)
```
Thanks for trying aider and filing this issue.
This sounds like it would require using an undocumented API or abusing a documented API for an off-label purpose?
@RodolfoCastanheira Great, that looks like a simple solution. We should have all the pieces then?
@paul-gauthier I don't know whether it's undocumented or off-label. I mainly forwarded the idea from Cline (see OP) and also made the same suggestion to LiteLLM (see the mention above).
I think there's a decent argument to be made that it's acceptable, if not officially supported.
- Copilot lets you make "Extensions" which are allowed to call the chat endpoints. Here's an official example from GitHub that does that.
- Elsewhere in the extensions documentation, they mention that the chat endpoint is rate-limited, so the potential for abuse is low.
The main difference between the extensions and what Aider (and the other existing, unofficial integrations) would do is around the auth token management. For extensions the tokens are automatically managed for you and the short-lived access token is passed to each call to the extension code. But in both cases, it's using an OAuth style flow that authenticates as your Github user, and that token stops working if you stop paying.
I can understand the worry for a high-profile project like Aider doing an integration like this. Hopefully this helps clear it up some.
Aider relies on litellm for integrations to LLM API providers, so I think it would be best to focus on getting it implemented there.
That said, it seems unclear that Copilot would sanction this sort of use of their API. So I'm not sure it would be appropriate for litellm (or aider) to add this sort of support.
I'm happy to be shown something that would clarify that this would be a legitimate use of the copilot API.
I've checked the usage policy[1] and the official forum, but couldn't find anything that says third-party integrations are either allowed or forbidden. When someone asks for help with an unofficial integration, the GitHub staff just lets the community respond (they don't jump in and say "no, you can't do that" or "we don't support that").
It seems like personal use for coding is okay, as long as you're not getting out of hand. And it looks like these unofficial integrations are tolerated, even if this usage is not officially endorsed. The silence on the matter seems like a deliberate choice.
[1] https://docs.github.com/en/site-policy/github-terms/github-terms-for-additional-products-and-features#github-copilot
Here's the official documentation on how agents should use their API: https://docs.github.com/en/copilot/building-copilot-extensions/building-a-copilot-agent-for-your-copilot-extension/using-copilots-llm-for-your-agent
I think it's quite a strong sign that they allow such usage.
I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days.
Note: A bot script made these updates to the issue.
Any update?
The docs above explain how a GitHub Copilot Agent can access the API. I see no indication that 3rd party tools like aider can use it.
Please correct me if I am misunderstanding the docs.
Please correct me if I'm wrong, but my understanding is that in the worst-case scenario Aider would need its own "gh-copilot-proxy-agent" that would basically proxy Aider's LLM requests. The main point is that Aider is a great agent, and it would be in the spirit of their documentation to make it compatible with their APIs one way or another.
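To make that concrete, here is a purely hypothetical sketch of what such a proxy agent could look like, based on my reading of the agent docs linked above. The route and handler names are made up, and a real agent would also have to handle the SSE streaming format those docs describe; the X-GitHub-Token header and the chat completions endpoint are what the docs mention, as far as I can tell.

```python
import requests
from flask import Flask, Response, request

app = Flask(__name__)

@app.post('/agent')
def proxy_to_copilot():
    # Per my reading of the agent docs, Copilot sends a short-lived token for
    # the calling user in the X-GitHub-Token header of each request.
    token = request.headers.get('X-GitHub-Token', '')

    # Forward the chat payload to Copilot's chat completions endpoint and
    # relay the answer back to the caller.
    upstream = requests.post(
        'https://api.githubcopilot.com/chat/completions',
        headers={'authorization': f'Bearer {token}'},
        json=request.get_json(),
    )
    return Response(
        upstream.content,
        status=upstream.status_code,
        mimetype=upstream.headers.get('content-type', 'application/json'),
    )
```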
I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days.
Note: A bot script made these updates to the issue.
I vote in favor. This integration will be really useful for those who already have a Copilot subscription. Regarding “unofficial” integrations, Zed offers Copilot integration for both code completion and chat. They utilize copilot.vim.
I also want to use the GitHub Copilot model from aider.
However, a note about using copilot.vim as an example: its author, tpope, is a GitHub employee, so it isn't really a case of using the API without permission.
If you want to follow the official policy, you could register aider as a copilot extension and use it as an agent, like @aider xxxx .
I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days.
Note: A bot script made these updates to the issue.
Could we get any clues from how CopilotC-Nvim/CopilotChat.nvim implements chat, and use that as a basis for an aider provider?
It's as described in a few comments earlier in this thread. They all piggyback on the official extension to do the auth or have extracted the code from it to do so, and then it's just a normal HTTP endpoint that looks like OpenAI with an OAuth token to be refreshed periodically. From a technical standpoint there's nothing particularly challenging.
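For anyone who wants to see it spelled out, the chat call itself is a plain OpenAI-style request. A minimal sketch, assuming the endpoint and headers that CopilotChat.nvim and the other plugins appear to use (all unofficial and subject to change):

```python
import requests

# copilot_token is the short-lived token from the exchange sketched earlier in
# the thread; it goes in a standard Bearer authorization header.
copilot_token = '...'

resp = requests.post(
    'https://api.githubcopilot.com/chat/completions',
    headers={
        'authorization': f'Bearer {copilot_token}',
        'editor-version': 'Neovim/0.6.1',
        'editor-plugin-version': 'CopilotChat.nvim/2.0.0',
        'copilot-integration-id': 'vscode-chat',  # value the plugins send; unofficial
        'content-type': 'application/json',
    },
    json={
        'model': 'gpt-4o',  # available model names depend on your Copilot plan
        'messages': [{'role': 'user', 'content': 'Explain refresh tokens in one sentence.'}],
        'stream': False,
    },
)
resp.raise_for_status()
print(resp.json()['choices'][0]['message']['content'])
```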
But Paul's concern here, as I understand it, is that all the Neovim plugins are using Copilot in a way that is perhaps tolerated but not officially approved, and adding support to Aider would be doing the same. And Aider is
- a high-profile project
- attached to Paul's personal reputation
- automating the request/response cycle in a way that many chat services (like ChatGPT, as opposed to just using the API) explicitly prohibit. And Copilot straddles the line between "API" and "chat service"
so he's not totally comfortable with it. (Apologies if I'm incorrectly representing his stance here.)
That aside, the integration would also need to go into LiteLLM since Aider is just using that for its LLM client. There's an issue for that here.
An official API for calling language models in GitHub Copilot has been released.
Using this, I think it should be possible to clear up the legal issues that have been a concern.
It seems that recline is implemented using this officially provided API (if it is released as a VS Code extension, the legal issues can apparently be cleared): https://github.com/julesmons/recline
https://code.visualstudio.com/api/extension-guides/language-model-tutorial
I'm labeling this issue as stale because it has been open for 2 weeks with no activity. If there are no additional comments, I will close it in 7 days.
Note: A bot script made these updates to the issue.
Upvote. This would be awesome!
Both Cline and Roo Code are using it now and it works great for Sonnet.
This would be great.
Hi guys, I'd like to introduce Copilot Proxy, an experimental VS Code extension designed to enable AI assistants like Aider to access GitHub Copilot's language models. It's a proof of concept aiming to explore the potential and limitations (see the Disclaimer).
https://github.com/lutzleonhardt/copilot-proxy https://youtu.be/i1I2CAPOXHM
Best thing would be to open an issue at litellm and see if they can add it as a provider. Aider relies on litellm to connect to API providers.
Hi Paul, there is already an issue opened: https://github.com/BerriAI/litellm/issues/6564
updates?
If you want to use it straight away, this seems like a good option. https://github.com/lutzleonhardt/copilot-proxy
Can confirm it works well and has direct instructions for aider setup. I recommend diff mode, and avoid o1 due to rate limits.