
Allow engine or runtime version to be expressed for local MCP servers

Open joelverhagen opened this issue 6 months ago • 7 comments

Is your feature request related to a problem? Please describe.

Currently, npm and Python-based MCP servers can be marked as such but there is no indication of the runtime version. In an npm package, the package.json can declare an engines node and define what npm and node versions are needed to run the thing. Similarly there is a Requires-Python node in a Python wheel's metadata.
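For example, an npm package declares its runtime requirements in package.json like this (package name and versions here are illustrative):

```json
{
  "name": "example-mcp-server",
  "version": "1.0.0",
  "engines": {
    "node": ">=18.0.0",
    "npm": ">=10.0.0"
  }
}
```

The Python equivalent is a `Requires-Python: >=3.8` line in the wheel's METADATA file.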

This is not surfaced in the MCP registry metadata, which means that runtime incompatibilities are not handled gracefully.

AFAIK, the best an MCP registry client can do is detect that the package type is, say, npm and that the user does not have npm installed, then prompt the user to install it. VS Code does this:

[Image] (tested without MCP server context, just with the manual npm install flow)

Describe the solution you'd like

Runtime information could be optionally expressed in the server node of the OpenAPI definition: https://github.com/modelcontextprotocol/registry/blob/6b22cf09b376ed94c772c69951a2d125e495b26b/docs/openapi.yaml#L165C5-L200

Making it an arbitrary KVP map like engines seems like a good start, or we could be more prescriptive about the format per ecosystem.

The client could then enrich the error experience by noticing that the installed npm version is not compatible with the MCP server and, say, select an older version of the MCP server (and be clear about that) or prompt the user to install a newer npm version.
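To sketch what that client-side check could look like, here is a minimal Python illustration, assuming a hypothetical `engines` map in server.json. Only the `>=` operator is handled; a real client would use a full semver range parser.

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version like "10.2.1" into (10, 2, 1) for comparison."""
    return tuple(int(part) for part in v.split("."))

def satisfies(installed: str, constraint: str) -> bool:
    """Check an installed version against a ">=X.Y.Z" constraint."""
    if constraint.startswith(">="):
        return parse_version(installed) >= parse_version(constraint[2:])
    raise ValueError(f"unsupported constraint: {constraint}")

def check_engines(installed: dict, engines: dict) -> list:
    """Return a list of human-readable problems; empty means compatible.

    `installed` maps tool name -> detected version on the user's machine;
    `engines` is the hypothetical map from server.json.
    """
    problems = []
    for tool, constraint in engines.items():
        version = installed.get(tool)
        if version is None:
            problems.append(f"{tool} is not installed")
        elif not satisfies(version, constraint):
            problems.append(f"{tool} {version} does not satisfy {constraint}")
    return problems
```

With this, `check_engines({"npm": "9.5.0"}, {"npm": ">=10.0.0"})` would report the incompatibility before the install button is ever clicked, and the client could offer to upgrade npm or fall back to an older server version.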

Tooling could be created to init the server.json from a package.json, filling out the mapped engine properties as needed.

The same concept applies to Python and .NET (the area I am looking at specifically).

Describe alternatives you've considered

This could be deferred entirely to the generated single-shot command. So, if npm is not installed, VS Code could prompt to install the latest LTS version (nodejs.org/en/download does this). Then, when npm determines the engines incompatibility after downloading the package, it could provide its own error message.

But this is at least 2 operations ("install MCP server" then "install npm") into the flow and could be confusing to users that are not expert in the node.js/npm ecosystem but want to use that cool MCP server written in JS. If the MCP client has this context, it could provide a more guided experience.

joelverhagen avatar Jun 05 '25 02:06 joelverhagen

This is a good callout, thank you!

Should we consider folding it into runtime_hint? i.e. turning that into an object (or array of objects) with command+engine pairings?

tadasant avatar Jun 05 '25 14:06 tadasant

@connor4312 - do you have thoughts on this? I think you worked on runtime_hint also.

Maybe we can look at a straw man for each ecosystem:

  • npm example: "runtime_hint": { "command": "npx", "engines": { "npm": ">=10.0.0" }, "os": "linux", "cpu": "x64" }
  • python example: "runtime_hint": { "command": "uvx", "Requires-Python": "3.7" }
  • nuget example: "runtime_hint": { "command": "dnx", "sdk": { "version": "8.0.116" } }

(I may have selected the wrong source of truth, but I hope to illustrate the diversity)

I think there would be two general solution categories.

  1. MCP registry defers the ecosystem-specific complexity to the client tooling, and minimally facilitates this metadata
    • Essentially the command property could be mandated but the other properties are undefined and left to the MCP package author to align on.
    • This has the benefit of not needing to model runtime compatibility in the registry but allows inconsistency in the ecosystem.
  2. MCP registry defines the runtime_hint shape per package type. So there is an NpmRuntimeHint model, PythonRuntimeHint model, etc.

I like the idea of option 2 (force registry entries to align to a specific schema), I just wonder how feasible it is for an initial release, since it essentially pulls the moving target of "modelling runtime compatibility" (the core of this issue, btw 😄) into the registry schema itself. It definitely would make client tooling life simpler if the possible npm vs. Python vs. .NET properties are described up front, so error handling code can be written against them.

The merit of option 1 is that it carves out a spot for the ecosystem to naturally experiment and figure out what granularity is needed, without trying to front-load the design work.
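For illustration, option 2 might look like this in the OpenAPI YAML (model names and fields here are a sketch, not a proposal):

```yaml
NpmRuntimeHint:
  type: object
  properties:
    command:
      type: string
      example: npx
    engines:
      type: object
      additionalProperties:
        type: string
      example:
        npm: ">=10.0.0"
PythonRuntimeHint:
  type: object
  properties:
    command:
      type: string
      example: uvx
    requires_python:
      type: string
      example: ">=3.7"
```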

joelverhagen avatar Jun 05 '25 14:06 joelverhagen

As you said, npm packages already have an engines field, and PyPI has a requires_python field. .NET also seems to have this, though I am less familiar with it. Docker Hub does not as far as I know, though the OCI format is standardized and I think needing version constraints there is super rare.

This is not surfaced in the MCP registry metadata, which means that runtime incompatibilities are not handled gracefully.

Isn't it just up to the client to handle it gracefully? As a client, I know a bit about which registry each server is coming from (or at least how to install them), so I could read npm info <package> or pip show <package> and do any necessary version checks ahead of time. Having the info in the MCP registry would save a step but would also be a duplicative source of information.
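A sketch of that client-side approach in Python: run `npm info <package> --json` (a real npm flag) and read the engines field from its output. Invoking npm itself is left out here; the function only parses a captured output string.

```python
import json

def engines_from_npm_info(npm_info_json: str) -> dict:
    """Extract the engines map from `npm info <package> --json` output.

    Returns an empty dict when the package declares no engine constraints.
    """
    data = json.loads(npm_info_json)
    return data.get("engines", {})

# Captured sample output; a real client would get this from a subprocess.
sample = '{"name": "some-mcp-server", "version": "1.2.3", "engines": {"node": ">=18"}}'
```

The client could then run the version comparison against these constraints before attempting the install, with the upstream registry as the single point of truth.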

I'm not against this proposal if we do need an out-of-bound source of version information, but if the upstream npm/python/etc registry has this information and can serve as a point of truth I would rather use that.

connor4312 avatar Jun 09 '25 21:06 connor4312

It's a good point that this may be a case where it doesn't serve us to add complexity when the end result is solely denormalized data in server.json (if we were to include it, it would re-open the can of worms as to all the kinds of nice-to-have data from the downstream package registries).

Maybe best left to a "best practices" document that encourages clients' mirrors of the registry to consider pulling down this data for the runtimes and package registries they choose to support.

tadasant avatar Jun 09 '25 21:06 tadasant

My mental model is based on VS Code extensions and NuGet packages where there's a "search"/"browse" with an install button. It is very nice for users to be able to know up front that an install will fail or require additional installation steps before they click the install button. VS Code for example knows the compatibility information of a VS Code extension up front and can behave "smartly" at install time. It can transparently go to older versions if the VS Code engine requirement makes newer versions incompatible. Framework compatibility is a huge pain in NuGet (dependencies, not tools ~= MCP servers to be clear). Install -> Fail -> try another version is an un-fun loop.

I am thinking (could totally be wrong here) that the MCP registry surfacing this information will make it easier for projections to enrich their UI experience and tell the user up front.

Parsing tool-specific output like npm info <package> definitely seems like a reasonable approach, but that would only happen after the install button is clicked (right?)

I'm imagining the moment in time when the user clicks "install" (or immediately before, when they are evaluating the MCP server). How can we guide them to land gracefully with an installed MCP server, even if their environment is not ready to run it?

I agree having the information duplicated feels awkward, but I think duplicating information is the name of the game for MCP registry (if I understand correctly, consumers will read client specific projections of the data, not unlike how MCP registry could duplicate information from the underlying registry but at publish time).

BTW I definitely don't think this is a go-live blocker. I wanted to start the conversation because I anticipate client runtime incompatibility being a top MCP server consumer dissat.

joelverhagen avatar Jun 09 '25 21:06 joelverhagen

This could be enriched by the service that reads the MCP registry to produce their own "compatibility" enrichment automatically, and not require any change from the MCP registry itself.

joelverhagen avatar Jun 09 '25 21:06 joelverhagen

Yea, I understand where you're coming from, and I think it's ultimately a design choice of the registry (https://github.com/modelcontextprotocol/registry/issues/131). From early talks with Anthropic folks, I think VS Code will not directly hit the public registry but will use our own proxy or perhaps one provided by GitHub; I'm not sure of the state of things there. And we have the ability to customize that.

If we do this, I think a Record<string, string> map with a comment describing the expected baseline compatibility checks would be the way to go -- basically, as you said, mirroring package.json engines, pyproject.toml, etc. Just to avoid blowing up the schema and having to make schema-level changes for each new package manager.
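Concretely, that free-form map might look like this inside a server.json entry (field name and placement are illustrative, not decided):

```json
{
  "engines": {
    "node": ">=18.0.0",
    "npm": ">=10.0.0"
  }
}
```

Since values are opaque strings keyed by tool name, adding support for a new package manager (uv, dnx, etc.) requires no schema change; interpretation is left to the client for the ecosystems it chooses to support.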

connor4312 avatar Jun 09 '25 21:06 connor4312