Add GitHub Copilot provider
The main idea: a GitHub Copilot subscription is excellent and cheap, and it can be integrated here the way Cline and codecompanion.nvim have done.
What problem or use case are you trying to solve? Developers need access to powerful AI models for coding assistance, but enterprise-grade solutions are often expensive. GitHub Copilot offers access to advanced models (including Claude 3.5 Sonnet) at a very competitive price point ($10/month). Projects like Cline have demonstrated that it's possible to integrate with Copilot's API through VS Code's extension system to provide this functionality in other tools.
Describe the UX of the solution you'd like
- Allow users to authenticate with their GitHub Copilot subscription
- Integrate Copilot's AI capabilities directly into the All-Hands development environment:
  - Code completion
  - Code explanation
  - Natural language to code conversion
  - AI-assisted problem solving
- Maintain the existing All-Hands workflow while enhancing it with Copilot's capabilities
Do you have thoughts on the technical implementation? Based on successful implementations in other projects:
- Leverage VS Code's extension API to connect with Copilot:
  - Reference: CodeCompanion's Copilot Integration
- Use GitHub's OAuth token system for authentication
- Implement similar to how Cline achieved it (Discussion #660)
- Key components needed:
  - GitHub OAuth authentication
  - VS Code extension API integration
  - Copilot API wrapper
  - Token management system
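To make the token-management piece concrete: community editor plugins obtain a short-lived Copilot API token by exchanging the user's GitHub OAuth token at an internal endpoint. A minimal sketch of that exchange follows; the endpoint URL and the response shape are assumptions based on those community projects, not a documented public API.

```python
import json
import urllib.request

# Unofficial endpoint observed in community Copilot plugins (assumption,
# not a documented public API; it may change without notice).
COPILOT_TOKEN_URL = "https://api.github.com/copilot_internal/v2/token"


def token_request(oauth_token: str) -> urllib.request.Request:
    """Build the exchange request: GitHub OAuth token in, Copilot token out."""
    return urllib.request.Request(
        COPILOT_TOKEN_URL,
        headers={
            "Authorization": f"token {oauth_token}",
            "Accept": "application/json",
        },
    )


def fetch_copilot_token(oauth_token: str) -> str:
    """Perform the exchange; the returned short-lived token is what the chat API accepts."""
    with urllib.request.urlopen(token_request(oauth_token), timeout=10) as resp:
        return json.loads(resp.read())["token"]
```

The returned token expires after a short window, so a real integration would need to refresh it periodically — that is the "token management system" bullet above.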
Describe alternatives you've considered
- Direct OpenAI API integration (more expensive)
- Anthropic Claude API (requires separate enterprise contract)
- Custom AI solution (requires significant development resources)
GitHub Copilot integration offers the best balance of:
- Cost-effectiveness ($10/month per user)
- Access to premium models (Claude 3.5 Sonnet)
- Enterprise-ready security
- Proven implementation path
Additional context
Several projects have successfully implemented this integration:
- Cline has a working implementation
- CodeCompanion provides open-source reference
- Growing community of developers using this approach
This would provide All-Hands users with enterprise-grade AI capabilities while maintaining reasonable costs and leveraging existing GitHub infrastructure.
If you find this feature request or enhancement useful, make sure to add a 👍 to the issue
Thank you for the proposal. In the link provided, I don't see that Cline has implemented it, the issue is linked to a PR that has been closed, not merged.
We have previously seen another proposal for a GitHub workaround, but that code was never going to work. The example linked here is better, but it still needs to get a token in a way that feels like circumventing the normal use of Copilot?
More importantly, we don't connect to models directly; we use litellm to connect to models. There is an open issue about this on the litellm repo: https://github.com/BerriAI/litellm/issues/6564. I think it may be best to focus the discussion there. If litellm finds a possible way, we'll get it implicitly.
FWIW, the latest version of Cline added support for GitHub Copilot.
More importantly, we don't connect to models directly, we use litellm to connect to models.
Besides authentication with GitHub, the API is 100% OpenAI-like. The difficulty is indeed implementing the OAuth mechanism with GitHub to get the API token. Cline used some VS Code API, so I presume OpenHands would have to use a different approach, maybe like the Vim plugins that support Copilot.
Anyway, because of the authentication with GitHub, I don't expect litellm to support it directly (it seems out of scope for them). But if you code this authentication and get the token, you could use litellm to talk to the API just like any other OpenAI-compatible provider.
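To illustrate the "just like any other OpenAI-compatible provider" part: once you hold a Copilot token, the chat call is plain OpenAI-style JSON. The endpoint and header layout below are assumptions modeled on what the editor plugins appear to send; litellm's OpenAI-compatible path would build an equivalent request.

```python
import json
import urllib.request

# Chat endpoint used by Copilot in editor plugins (assumption; unofficial).
COPILOT_CHAT_URL = "https://api.githubcopilot.com/chat/completions"


def chat_request(copilot_token: str, prompt: str, model: str = "gpt-4o") -> urllib.request.Request:
    """Build an OpenAI-style chat/completions request against the Copilot API."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        COPILOT_CHAT_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {copilot_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# With the token in hand, litellm could be pointed at the same endpoint as a
# generic OpenAI-compatible provider, e.g. (untested sketch):
#   litellm.completion(model="openai/gpt-4o",
#                      api_base="https://api.githubcopilot.com",
#                      api_key=copilot_token, messages=[...])
```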
@openhands-agent do you have some solutions?
Just found this as a workaround: https://github.com/jjleng/copilot-more. It sets up a local endpoint for you to bind to.
Thank you for the proposal. In the link provided, I don't see that Cline has implemented it, the issue is linked to a PR that has been closed, not merged.
It has now been implemented according to a collaborator: https://github.com/cline/cline/discussions/660#discussioncomment-12025668
@enyst It seems litellm merged PRs to support the feature last week: https://github.com/BerriAI/litellm/pull/8577/files and https://github.com/BerriAI/litellm/pull/9371. The PRs are merged but not yet released to litellm main. You can find the status in the thread you mentioned previously.
I'd like to pick up this issue and adapt it for OpenHands if you are not working on it. I will start once it's released to litellm main. 😊 Could you please assign this issue to me if possible?
@AutoLTX I appreciate your contributions! This issue is tricky though.
If it's merged in litellm, there should be nothing for us to do to support it: it should already work.
That is the case with many of the dozens and dozens of other models.
But in this case it's tricky, because of how this works specifically: it seems the flow is to log in with Copilot, which saves a credentials file in the user's home directory; that file is then read and a token is sent identifying the client as VS Code or Neovim.
So if you run in development mode, after you login with copilot, it may work (after the litellm update). However, if you run with docker run, this flow won't work.
In my opinion, we should leave it at that. But if you think we should do something more, I suggest we ask in Slack to see what others think.
Perhaps important: GitHub also makes models available in a different way: https://github.com/marketplace?type=models
What do you think, could this be good enough?
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.
Hi team, first off, thank you for building OpenHands — it’s a fantastic foundation for AI orchestration.
I’m requesting support for a new provider: GitHub Models API (https://github.com/marketplace?type=models), which allows authenticated access to models like GPT-4.1 via a GitHub API — similar to OpenAI or Anthropic.
Why This Matters:
- GitHub has started offering free and pay-per-use access to high-quality models (OpenAI, Mistral, etc.) through a standardized API.
- Integrating GitHub Models would provide more deployment flexibility, failover redundancy, and provider diversity.
- Like OpenAI, it uses a /chat/completions interface, making adaptation likely straightforward.
- GitHub’s Models API already supports per-user PAT authentication, rate limits, and multiple model providers under one roof.
Proposal:
Implement GitHub Models API as a selectable provider in the same way as:
- OpenAI
- Mistral
- Groq
- Perplexity
The integration could:
- Accept GITHUB_PAT for authentication.
- Allow specifying model=publisher/model-name as required by the API.
- Optionally expose a custom endpoint override.
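A sketch of what such a provider call might look like. The endpoint URL below is an assumption to verify against the GitHub Models documentation (an older deployment used a models.inference.ai.azure.com host), as is the exact publisher/model-name value.

```python
import json
import os
import urllib.request

# GitHub Models inference endpoint (assumption; confirm against the docs —
# an earlier deployment lived at models.inference.ai.azure.com).
GITHUB_MODELS_URL = "https://models.github.ai/inference/chat/completions"


def models_request(pat: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style request; `model` uses the publisher/model-name form."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GITHUB_MODELS_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {pat}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical usage, reading the PAT from the proposed environment variable:
#   models_request(os.environ["GITHUB_PAT"], "openai/gpt-4.1", "hello")
```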
Reference:
Model Catalog: https://github.com/marketplace?type=models
Thanks!
GitHub Models is only meant for playground/experimentation; it has low rate limits and isn't meant for use as a coding assistant or for longer-running tasks. It's meant to drive people toward signing up for Azure's AI services.
This also isn't related to GitHub Copilot.
@AutoLTX as well. See my previous comment on the GitHub Models marketplace.
I also have one of the litellm forks for GitHub Copilot running, and it's working amazingly. To avoid making changes to OpenHands, I have it running as a separate instance and I'm using litellm_proxy/ for the provider.
If there's any desire, I can write up a guide for using a different branch of litellm while we await it being directly merged, then any effort we want can be spent on getting the core code merged upstream so things just work in the future!
Wait, so it's already possible to use OpenHands with GitHub Copilot? Is there a guide?
@S-A-Martin Yes and no. It is currently an open pull request for litellm here, but it's not merged yet. If it's merged into main, it's automatically integrated into OpenHands. They are currently testing, so it's happening soon (hopefully).
And no, there is no guide. You can use it via the advanced options. If it's added, you can check how to use it here.
You can run litellm as a separate container and merge it into your own fork. That way you can use your own litellm proxy with the GitHub API on OpenHands (see https://github.com/BerriAI/litellm/tree/litellm_dev_03_05_2025_contributor_prs).
@mamoodi litellm has merged this into main, so this issue can be closed.
This issue is stale because it has been open for 40 days with no activity. Remove the stale label or leave a comment, otherwise it will be closed in 10 days.