Feature request: ability to use MCP servers
ability to use MCP Servers?
I think this would be a very interesting feature. There's a nifty little project called fastmcp (https://github.com/jlowin/fastmcp). Think FastAPI, but it allows your LLM to access external functionality. For example, you could create an MCP server that connects to your email; then on the command line you could ask llm to search/send email, etc.
Currently MCP is partially implemented in Claude Desktop, but implementing it in llm would allow for complete integration and the ability to use other open-source models. Think of it as a standardised way of implementing function calling.
Shouldn't be too hard with FastMCP, no?
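For context on the "standardised function calling" point: under the hood, MCP messages are JSON-RPC 2.0, with methods like `tools/call` for invoking a tool on a server. A minimal sketch of what a client sends to such a hypothetical email server (the tool name and arguments here are purely illustrative, not from any real server):

```python
import json

# Sketch of an MCP tool-invocation request (JSON-RPC 2.0).
# "search_email" and its arguments are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_email",
        "arguments": {"query": "from:alice subject:invoice"},
    },
}

# The client serializes this and sends it over the transport
# (stdio or HTTP); the server replies with the tool's result.
print(json.dumps(request, indent=2))
```

Any model that can emit a tool name plus JSON arguments can drive this, which is why it works as a model-agnostic function-calling layer.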
Related / required: https://github.com/simonw/llm/issues/607
Corrected link for fastmcp https://github.com/jlowin/fastmcp
Spotted this nice CLI that you can configure to use MCP servers!
Interesting that this is also named llm
What does MCP stand for?
Right, Model Context Protocol: https://github.com/modelcontextprotocol
Are you considering implementing MCP @simonw ?
FastMCP is now the official MCP SDK: https://github.com/modelcontextprotocol/python-sdk
But yeah, this would be super cool. I've recently seen just how powerful and useful MCP can be and would love to play with it more, but I don't want to have to use Claude Desktop to do it. (It's not available on Linux, anyway.) Most of my AI queries these days go through llm.
I wouldn't mind contributing to this integration, I'm just not familiar enough yet with the inner workings of llm and the plugin system
So I've actually written my own CLI MCP tool using Typer and Python; it's open-sourced here: https://github.com/lingster/mcp-llm. It only supports Claude for now, but PRs or suggestions for improvements are welcome!
> Spotted this nice CLI that you can configure to use MCP servers!
This CLI uses LangChain under the hood, so it should work with any model for which LangChain supports tool calling. That support is mostly about knowing how to describe the available tools to the model. For example, look at this code:
https://github.com/langchain-ai/langchain/blob/3848a1371da1a5bc6a4b0fcae461afcf529dc328/libs/partners/anthropic/langchain_anthropic/chat_models.py#L87
And compare it with the first code listing on this page: https://docs.anthropic.com/en/docs/build-with-claude/tool-use/overview
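To make the comparison concrete, here is a sketch of the kind of tool description Anthropic-style tool calling expects: a name, a description, and a JSON Schema for the input. The LangChain adapter code linked above is essentially translating LangChain's tool representation into this shape. The specific tool below is illustrative:

```python
import json

# Sketch of an Anthropic-style tool definition.
# The tool itself ("get_weather") is just an example.
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a given location",
    "input_schema": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
        },
        "required": ["location"],
    },
}

# This dict would be passed in the `tools` list of an API request.
print(json.dumps(weather_tool, indent=2))
```

Other providers use a slightly different envelope for the same idea, which is exactly the per-provider translation work these adapters (and MCP) try to standardise away.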
I plan to implement MCP support as a plugin on top of tools, once that lands:
- https://github.com/simonw/llm/issues/898
Hi @simonw, does that mean you plan to implement MCP client capabilities as a plugin? Is there any issue to track that work, or any planned design? Would love to help if I can.
@danielmeppiel I created a plugin for this and added PR #1261 to list it in the docs. Would be nice if you gave it a try!
@simonw Any plans on adding the above MCP support to the CLI tool? It seems like it would be extremely useful, as wiring up generic tools via hand-written schemas is no longer considered best practice.