[Feature Request] Replace LLM instruction with an MCP server
Hey, thanks for the great framework — love building APIs with Encore!
Just wondering if you've considered providing an MCP server with Encore docs and examples. That way, we could avoid including large LLM instructions (~11K tokens) in the system prompt. Agents in MCP-compatible clients such as Cursor could fetch the relevant docs on demand, significantly reducing (or even eliminating) the lengthy LLM instructions, since MCP covers most of the same use cases.
For reference, I highly recommend checking out Mastra's MCP docs server configuration, which offers a super smooth experience!
Agreed this would be great, we're going to do it as soon as we can.
@mozharovsky I recommend using the Context7 MCP server — it has the docs for Encore.
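For anyone who wants to try this in the meantime, here's a rough sketch of what registering Context7 in an MCP-compatible client's config (e.g. Cursor's `mcp.json`) might look like — the exact package name and config location depend on your client and the Context7 docs, so treat this as illustrative:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

Once registered, the agent can query the server for Encore documentation instead of carrying it all in the system prompt.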