
MCP Resources and Prompts

Open pattobrien opened this issue 5 months ago • 28 comments

MCP Resources and Prompts are being implemented by various popular AI IDEs and CLI tools (e.g. VSCode Copilot, Claude Code, Claude Desktop). I would point to VSCode specifically as my personal favorite implementation of both MCP features.

Some notable use cases under this (rather large) feature umbrella could be:

  • allow Prompts to be triggered via slash-commands
  • allow Resources to be attached to context window via similar slash-commands (in VSCode it uses "@" prefix, rather than "/" for resources)
  • display all attached Resources as e.g. chip components in the chat UI, with the resource name/title
  • allow Tool results to add Resources to the context window
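As a concrete illustration of the second bullet, here is a minimal sketch (all names hypothetical, not from any existing codebase) of how a client could parse `@server:uri` resource mentions out of a chat message before attaching them to the context window:

```typescript
// Hypothetical sketch: extract "@server:uri" resource mentions from a chat
// message so the client can attach them to the context window as chips.
interface ResourceMention {
  server: string; // MCP server name, e.g. "wordpress-dev-docs"
  uri: string;    // resource URI, e.g. "wordpress://rules/base"
}

function parseResourceMentions(input: string): ResourceMention[] {
  // Matches "@<server>:<scheme>://<path>" tokens; the URI ends at whitespace.
  const pattern = /@([\w-]+):([\w+-]+:\/\/\S+)/g;
  const mentions: ResourceMention[] = [];
  for (const match of input.matchAll(pattern)) {
    mentions.push({ server: match[1], uri: match[2] });
  }
  return mentions;
}
```

The exact mention syntax is an assumption here; VSCode's `@`-prefix behavior differs in detail, but the parsing step is the same idea.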

I'm quite familiar with the MCP spec, as well as its implementation in the major MCP clients, so let me know what other details you'd like - more than happy to help.

pattobrien avatar Jul 09 '25 18:07 pattobrien

I think the idea is great ❤️ However, how about an agent-zero style approach, where we could have an "MCP creator" agent with a single- or multi-shot prompt that could dynamically create local MCP servers? For context, it should invoke a "web search agent" (configurable please; I run a SearXNG docker/podman container, no API key required) that searches for relevant info on how to implement an API or CLI tool call, then pass the compiled context back to the MCP creator agent, which writes the MCP config file and sends the tool definition to the parent agent.

mchzimm avatar Jul 10 '25 06:07 mchzimm

I would also love this. An absolutely underrated Claude Code feature, imo, is including MCP resources in rules. The syntax is simple, and it allows remotely updating rules for companies or open source projects. Pretty neat!

Example we have in our rules:

## WordPress
- Use @wordpress-dev-docs:wordpress://rules/base as your wordpress base coding rule
- Use @wordpress-dev-docs:wordpress://rules/quality_assurance_phpstan as guideline when working with phpstan in wordpress context
- Use @wordpress-dev-docs:wordpress://rules/quries as guideline when creating wordpress queries

Doc Reference: https://docs.anthropic.com/en/docs/claude-code/mcp#use-mcp-resources

JUVOJustin avatar Jul 11 '25 08:07 JUVOJustin

+1 to this, would make opencode immensely more usable for folks whose dev workflows are often guided by custom prompts/commands

VikashLoomba avatar Jul 11 '25 23:07 VikashLoomba

I'd love to have support for prompts and resources. ❤️

I'd +1 the slash-command for prompts and at-sign for resource reference.

I'm making an MCP server for storing prompts, and I have noticed that woefully few MCP client/host implementations support those capabilities. I'd be very happy if I could use my MCP server one day with Opencode. In any case it would definitely open new possibilities for Opencode as well.

HKiOnline avatar Jul 22 '25 07:07 HKiOnline

Huge +1 here.

My team has created an internal MCP server that we're using to share resources, prompts, and tools for all our devs under a single, authenticated, HTTP endpoint. We don't want to commit "common prompts" into all our repositories. We don't want to make devs copy/paste large prompts from some Confluence page. Exposing them via an MCP has worked well.

elementalvoid avatar Aug 05 '25 18:08 elementalvoid

I have a personal MCP server just for the purpose of sharing prompts across different tools.

It would be great if opencode could expose those as slash commands the same way avante.nvim and others do.

smissingham avatar Aug 08 '25 01:08 smissingham

+1

vfarcic avatar Aug 10 '25 00:08 vfarcic

+1

nelspir avatar Aug 14 '25 16:08 nelspir

Slash commands should be coming sometime this week: https://x.com/thdxr/status/1955718582639251488

rekram1-node avatar Aug 14 '25 16:08 rekram1-node

Slash commands look excellent, but sadly don't seem to work via MCP yet.

Great foundation though; it shouldn't be too hard to plug MCP prompts into this feature.

https://opencode.ai/docs/commands/

In the meantime, I'm symlinking my prompts collection all over the place 😅

smissingham avatar Aug 23 '25 12:08 smissingham

+1

adam-paterson avatar Sep 17 '25 14:09 adam-paterson

+1

john-persson-polestar avatar Sep 18 '25 20:09 john-persson-polestar

+1

nelspir avatar Sep 19 '25 16:09 nelspir

Those who have recently reacted: are those +1s specifically for MCP prompts?

rekram1-node avatar Sep 19 '25 16:09 rekram1-node

> Those who have recently reacted: are those +1s specifically for MCP prompts?

Yeah, prompts for MCP

adam-paterson avatar Sep 20 '25 22:09 adam-paterson

> Those who have recently reacted: are those +1s specifically for MCP prompts?

Yes. I recently added a prompt to my MCP server, and I'm not even sure how I would call it from opencode without slash commands. I can see the prompt being available in the MCP Inspector and as a slash command in Claude Code, but when I ask for it in opencode (Claude Sonnet 4) it can't find it:

opencode:

Image

Claude Code:

Image

MCP Inspector:

Image

john-persson-polestar avatar Sep 21 '25 08:09 john-persson-polestar

I think this can be best addressed once vercel ai sdk adds support for this: https://github.com/vercel/ai/issues/6294

rekram1-node avatar Sep 21 '25 20:09 rekram1-node

@rekram1-node if I could push back here a little bit: it doesn't really make sense for the AI SDK to implement "prompt" support like you are suggesting. Prompts are generally exposed as slash commands or parameterized forms meant for the user to initiate and fill in, rather than something for the underlying LLM/AI to interact with. Using the MCP TypeScript SDK, prompts should be exposed in opencode similarly to how it's done in Claude Code, where typing "/" makes a ListPromptsRequest to every connected MCP server and adds the results to the dropdown of commands. Basically, the LLM isn't given a list of prompts like it's given a list of tools; a user chooses and fills in a prompt template, which gets sent to the LLM.
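The flow described above (aggregate each server's prompt list into a command dropdown) could be sketched like this; the function and the `/server:prompt` naming scheme are hypothetical illustrations, not opencode's actual implementation:

```typescript
// Sketch: flatten prompts/list results from several MCP servers into
// slash-command entries for a client dropdown. The McpPrompt shape mirrors
// the "Prompt" object in the MCP spec's prompts/list result.
interface McpPrompt {
  name: string;
  description?: string;
  arguments?: { name: string; required?: boolean }[];
}

function toSlashCommands(promptsByServer: Record<string, McpPrompt[]>): string[] {
  const commands: string[] = [];
  for (const [server, prompts] of Object.entries(promptsByServer)) {
    for (const prompt of prompts) {
      // Namespace by server so two servers can expose a prompt with the same name.
      commands.push(`/${server}:${prompt.name}`);
    }
  }
  return commands.sort();
}
```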

VikashLoomba avatar Sep 22 '25 20:09 VikashLoomba

@VikashLoomba we are talking about MCP-server-supplied prompts; we use experimental_createMCPClient from the Vercel AI SDK. I am not suggesting the prompts be given to the LLM. But the MCP client from the Vercel AI SDK only returns tools right now; it won't grab prompts.

Agree on:

> Prompts are generally exposed as slash commands or parameterized forms meant for the user to initiate and fill in, rather than something for the underlying LLM/AI to interact with

Maybe I wasn't being clear here but we are in agreement

rekram1-node avatar Sep 22 '25 20:09 rekram1-node

> @VikashLoomba we are talking about MCP-server-supplied prompts; we use experimental_createMCPClient from the Vercel AI SDK. I am not suggesting the prompts be given to the LLM. But the MCP client from the Vercel AI SDK only returns tools right now; it won't grab prompts.
>
> Agree on:
>
> > Prompts are generally exposed as slash commands or parameterized forms meant for the user to initiate and fill in, rather than something for the underlying LLM/AI to interact with
>
> Maybe I wasn't being clear here but we are in agreement

Got it. I probably should have been clearer about what I was alluding to: we could use the underlying transport that's passed to experimental_createMCPClient to send the server a prompts-list request ourselves, instead of waiting for the AI SDK to expose prompts in its client implementation.
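For reference, the message such a client would send over the transport is a plain JSON-RPC request for the MCP spec's `prompts/list` method. A minimal sketch of building that request (the surrounding transport plumbing is assumed, not shown):

```typescript
// Sketch: the raw JSON-RPC request a client sends over an MCP transport to
// list prompts, per the MCP spec's "prompts/list" method.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function makeListPromptsRequest(id: number, cursor?: string): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "prompts/list",
    // "cursor" enables the spec's pagination; omit it for the first page.
    ...(cursor ? { params: { cursor } } : {}),
  };
}
```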

VikashLoomba avatar Sep 22 '25 20:09 VikashLoomba

yeah true

rekram1-node avatar Sep 22 '25 21:09 rekram1-node

Is there any chance we could expose the MCP client so that plugin developers could pass calls to it?

kziemski avatar Nov 19 '25 15:11 kziemski

Yeah we can expose that

rekram1-node avatar Nov 19 '25 16:11 rekram1-node

Thank you, man. If we're able to interact with the client in the plugin, it will provide an outlet for the MCP servers out there that are utilizing more of the spec than just tools.

kziemski avatar Nov 20 '25 19:11 kziemski

> I think this can be best addressed once vercel ai sdk adds support for this: vercel/ai#6294

It appears that the Vercel AI SDK functionality has been implemented. Do you plan to make MCP prompts and resources usable by opencode?

luifr10 avatar Nov 25 '25 16:11 luifr10

I have a PR ready that implements prompts in opencode. My implementation:

  1. fetches the prompts when the MCP server connects
  2. adds each prompt as a command, like /mcp.servername.promptname
  3. fetches the prompt when the user actually sends it and does the replacement just like any other command

I've tested it locally, and it works wonderfully (and I plan to open a PR for resources later on).

Can I just go ahead and open the PR?

paoloricciuti avatar Dec 12 '25 16:12 paoloricciuti

@paoloricciuti do you have the repo somewhere? I was hoping to test something with prompts and resources.

kziemski avatar Dec 12 '25 17:12 kziemski

@kziemski here's the branch on my fork...ready to open a PR whenever

https://github.com/paoloricciuti/opencode/tree/mcp-prompts

And this is what it looks like in use:

https://github.com/user-attachments/assets/e2cec744-1383-4de6-a86b-889b3186f32d

paoloricciuti avatar Dec 12 '25 18:12 paoloricciuti