
Open-source AI functionality provided by the Copilot Chat extension

kieferrm opened this issue 7 months ago · 9 comments

Our blog post outlines our motivation to open-source the client-side code of our AI features in VS Code. We also compiled FAQs.

Goals

  1. Open source is only useful if you can participate in the development process of the AI features. We need a development story that allows you to make code changes and debug AI interactions end-to-end. You need to be able to run the AI test suites. Since all AI features are powered by models, you also need access to models during development.
  2. Once the code is open source, we'll re-evaluate how we split the functionality between VS Code Core, built-in extension(s), and the Chat extension. We want to improve the user experience and simplify our architecture and build processes.

Approach

We'll first open-source the GitHub Copilot Chat extension. To do so, we need to:

  1. Ensure code compliance
  2. Define the strategy for service access
  3. Define how to run tests
  4. Define OSS builds
  5. Define issue management

Note: Today, NES functionality is separate from code completions. NES is implemented in the Chat extension, while completions are implemented in the GitHub Copilot completions extension. We have concrete plans to bring NES and completions together in the Chat extension. Therefore, at the moment, we don't have concrete plans to open-source the Copilot completions extension.

Compliance Review

  • We need to review every file in the Copilot Chat extension for compliance. This includes adding copyright headers and removing references to internal processes, IP, and issues. This is particularly important for our test suite, which contains test cases created with information from private issues.
  • After the review, we'll move the Chat extension code to a new repository without history, avoiding the need to review thousands of commits.

Service Access

  • The Chat extension is powered by the GitHub Copilot service. The GitHub Copilot service provides access to general-purpose and custom models, embeddings computation, and semantic code search of GitHub repositories.
  • To talk to the GitHub Copilot service, the Chat extension uses CAPI (the GitHub Copilot API). Just like our other production services, for example the settings sync service, the Copilot service will remain closed source, and its usage will continue to be regulated by its service license.
  • To debug AI interactions, you need to be able to run Code-OSS with the Chat extension installed. Normally, Code-OSS does not have access to production services. This is not a handicap for non-AI features, but AI features are useless without model access. Our current thinking is that we'll provide a closed-source, licensed npm module providing CAPI access that you can choose to install into the codebase before launching Code-OSS. Alternatively, you can use BYOK without CAPI for limited scenarios.
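The two access paths described above could be resolved along these lines. This is a hypothetical sketch only: `resolveEndpoint`, `ByokConfig`, and the module shape are illustrative assumptions, not the actual CAPI module's API.

```typescript
// Hypothetical sketch of endpoint resolution for a Code-OSS dev setup:
// prefer the optionally installed, licensed CAPI access module; fall back
// to a user-supplied BYOK endpoint; otherwise leave AI features disabled.
// All names here are illustrative, not real APIs.

interface ModelEndpoint {
  baseUrl: string;
  apiKey: string;
  source: "capi" | "byok";
}

interface ByokConfig {
  baseUrl?: string;
  apiKey?: string;
}

function resolveEndpoint(
  capiModule: { baseUrl: string; apiKey: string } | undefined,
  byok: ByokConfig
): ModelEndpoint | undefined {
  // Prefer the licensed CAPI module when it has been installed locally.
  if (capiModule) {
    return { baseUrl: capiModule.baseUrl, apiKey: capiModule.apiKey, source: "capi" };
  }
  // Fall back to a user-supplied key and endpoint (limited BYOK scenarios).
  if (byok.baseUrl && byok.apiKey) {
    return { baseUrl: byok.baseUrl, apiKey: byok.apiKey, source: "byok" };
  }
  // Without either, AI features stay inert in Code-OSS.
  return undefined;
}

// No CAPI module installed, but a BYOK config is present.
const endpoint = resolveEndpoint(undefined, {
  baseUrl: "https://api.example.com/v1",
  apiKey: "sk-local-dev",
});
```

The point of the fallback order is that a contributor without a service license can still exercise a subset of AI features with their own key.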

Tests

  • We built a test infrastructure that deals with the stochastic nature of LLMs and relies heavily on caching LLM responses for given prompts. If a code change alters the prompt for a specific scenario, you want to issue LLM requests only for the changed prompts and use the cached responses in all other cases. The cache is implemented using Redis. We need to allow read-only access to the Redis cache, which, in Microsoft terminology, makes the Redis cache a production service. We therefore need to go through the motions of creating a new production service.
  • We need to investigate if we can use PR submissions for cache baseline updates.

Builds

  • We'll need to define what PR builds look like for the Chat extension.

Issues

  • Today, issues for AI features are in three different repositories: microsoft/vscode, microsoft/vscode-copilot-release, and the private repository we use(d) for developing the Chat extension.
  • Going forward, all client issues should be in microsoft/vscode.
  • We'll move only select issues from the private repo into the public repo.
  • We'll archive/lock the microsoft/vscode-copilot-release repo, so that no new issues can be created there and existing issues are locked. The issues will continue to be accessible.
  • We need a clearer separation of client issues from service issues. We have a large number of service issues, particularly in the microsoft/vscode-copilot-release repo, that are not actionable and have no clear path to closure.

kieferrm avatar May 15 '25 14:05 kieferrm

@kieferrm is agent mode also part of the open source plan?

ups216 avatar May 19 '25 17:05 ups216

@kieferrm: Will the Visual Studio 2022 GitHub Copilot also be open-sourced?

dengyakui avatar May 19 '25 17:05 dengyakui

Great work

iwangbowen avatar May 20 '25 00:05 iwangbowen

About https://github.com/microsoft/vscode/issues/249031#issuecomment-2891711609. @ups216 yes, everything but ghost text completions. See the note in the issue description.

kieferrm avatar May 20 '25 02:05 kieferrm

Out of curiosity, as I'm not proficient in VSCode's architecture, what's the value in moving Copilot from being an extension to being part of the core codebase as opposed to a built-in extension as mentioned here?

In fact, many core features of VS Code are built as extensions and use the same Extension API.

elongl avatar May 20 '25 06:05 elongl

Is there going to be an option to be opted out by default, or to outright disable the AI features, for people who do not wish to use them? That's currently something I prefer with the extension, and I'm sure a great deal more would too. My experience with Copilot was subpar and I would certainly not want to use it even if it were integrated.

NaokiS28 avatar May 20 '25 12:05 NaokiS28

Can users skip using CAPI and just bring their own models or API services that the Chat can talk to?

htahir1 avatar May 20 '25 18:05 htahir1

Great! Will it support any LLM that complies with the OpenAI API specification?

hu-qi avatar May 21 '25 01:05 hu-qi

Looking forward to this!

trancethehuman avatar May 21 '25 21:05 trancethehuman

Any updates on open sourcing the extension?

VikashLoomba avatar May 28 '25 20:05 VikashLoomba

Will it only use public VS Code APIs? That would be fair, so other AI extensions can provide a good experience as well.

fabb avatar May 30 '25 06:05 fabb

Any updates on open sourcing the extension?

quintos955 avatar Jun 04 '25 00:06 quintos955

This will only give Copilot more customization that other extensions cannot use, while getting more people to help you maintain the closed GitHub Copilot AI service. The functionality should be opened up to all extensions, with GitHub Copilot consuming it as just another extension, instead of building GitHub Copilot into VS Code. This is fake open source: it uses VS Code capabilities that are not open to extensions to suppress other AI extensions.

How might this affect products like Windsurf and Cursor that are built on VS-Code and are competitors to GitHub CoPilot?

kirkilj avatar Jun 07 '25 14:06 kirkilj

Waiting for this update.

LakshmanKishore avatar Jun 18 '25 11:06 LakshmanKishore

it will open in 2049

kdlslyv avatar Jun 27 '25 10:06 kdlslyv

Check out https://github.com/microsoft/vscode-copilot-chat

tekumara avatar Jun 28 '25 07:06 tekumara

Without InlineCompletionItem this might as well be a glorified vibecoded webview chatwindow

kdlslyv avatar Jun 28 '25 10:06 kdlslyv

Without InlineCompletionItem this might as well be a glorified vibecoded webview chatwindow

InlineCompletion will be merged into the Chat extension; it's on the roadmap.

ritamariavermelho06 avatar Jul 01 '25 18:07 ritamariavermelho06

Should this issue be closed since first milestone is achieved?

trivikr avatar Jul 01 '25 21:07 trivikr

@ritamariavermelho06 Will the inline completion feature released in the open-source GitHub Copilot Chat be exactly the same as in the closed-source GitHub Copilot?

pai4451 avatar Jul 12 '25 03:07 pai4451