opencode
Load MCP in parallel and cache MCP tools
The remote MCP servers are painfully slow to load, especially since I am currently in China and coding over a VPN.
Loading the clients sequentially only makes this worse, since the startup times add up. After a few tests, I found that the initial client connection was the slowest step.
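Since each client connects independently, the sequential loop can be replaced with concurrent connections. A minimal sketch of the idea, where `connectClient` is an illustrative stand-in for the real MCP handshake:

```typescript
// Hypothetical sketch: connect to all configured MCP servers concurrently
// instead of one after another. `connectClient` stands in for the real
// SSE/HTTP handshake and is not opencode's actual API.
type McpConfig = { type: string; url: string };

async function connectClient(name: string, cfg: McpConfig): Promise<string> {
  // Placeholder for the real connection; simulate a slow handshake.
  await new Promise((resolve) => setTimeout(resolve, 10));
  return name;
}

async function loadClients(
  configs: Record<string, McpConfig>,
): Promise<string[]> {
  // Promise.all starts every connection at once, so total startup time is
  // bounded by the slowest client rather than the sum of all of them.
  return Promise.all(
    Object.entries(configs).map(([name, cfg]) => connectClient(name, cfg)),
  );
}
```

With three servers each taking a few seconds, this turns the sum into a maximum, which matches the drop in the client-load numbers below.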
Test Results
Config
```json
"mcp": {
  "context7": {
    "type": "remote",
    "url": "https://mcp.context7.com/sse"
  },
  "coin": {
    "type": "remote",
    "url": "https://mcp.api.coingecko.com/sse"
  },
  "fetch": {
    "type": "remote",
    "url": "https://remote.mcpservers.org/fetch/mcp"
  }
}
```
Below are the results from my simple benchmark of three runs:
Original implementation:
- Clients load: 8436ms - 12622ms
- Tools load (per message): 1192ms - 1387ms
Parallel client and tool loading:
- Clients load: 2437ms - 3412ms
- Tools load (per message): 367ms - 569ms
After caching:
- Clients and tools load together
- No additional loading per message
- Drawback: opencode must be restarted when an MCP server's tools change, but since these are hosted APIs I assume that will happen very infrequently.
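The caching step can be sketched as a simple per-server map, populated on first use and reused on every subsequent message. `listTools` is an illustrative placeholder for the real MCP `tools/list` request, not opencode's actual implementation:

```typescript
// Hypothetical sketch: fetch each server's tool list once and reuse the
// cached copy per message, instead of re-listing tools on every message.
type Tool = { name: string };

const toolCache = new Map<string, Tool[]>();

async function listTools(server: string): Promise<Tool[]> {
  // Placeholder for the real `tools/list` MCP request.
  return [{ name: `${server}.fetch` }];
}

async function getTools(server: string): Promise<Tool[]> {
  const cached = toolCache.get(server);
  if (cached) return cached; // cache hit: no network round trip
  const tools = await listTools(server); // cache miss: fetch once
  toolCache.set(server, tools);
  return tools;
}
```

Because the cache lives for the lifetime of the process, a tool-list change on the server side is only picked up after a restart, which is the drawback noted above.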