Feature request: ability to use MCP servers

Open • davidmoshal opened this issue 11 months ago • 17 comments

ability to use MCP Servers?

davidmoshal avatar Jan 06 '25 08:01 davidmoshal

I think this would be a very interesting feature. There's a nifty little project called fastmcp (https://github.com/jlowin/fast mcp). Think FastAPI, but it allows your llm to access external functionality. For example, you could create an MCP server that connects to your email, and then on the command line you could ask llm to search/send email, etc.

Currently MCP is partially implemented in Claude Desktop, but implementing it in llm would allow for complete integration and the ability to use other open source models. Think of it as a standardised way of implementing function calling.
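
For illustration, a minimal MCP server along those lines could look something like this (a rough, untested sketch assuming FastMCP's decorator-based API; the search_email tool and its body are just placeholders, not a real mail integration):

```python
# Hypothetical sketch of an MCP server exposing an email search tool via FastMCP.
from fastmcp import FastMCP

mcp = FastMCP("email")

@mcp.tool()
def search_email(query: str) -> list[str]:
    """Search the mailbox and return matching subject lines."""
    # A real server would talk to an IMAP account (or similar) here.
    return [f"No mailbox configured; you searched for: {query}"]

if __name__ == "__main__":
    mcp.run()
```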

lingster avatar Jan 13 '25 04:01 lingster

Shouldn't be too hard with FastMCP, no?

davidmoshal avatar Jan 13 '25 21:01 davidmoshal

Related / required: https://github.com/simonw/llm/issues/607

luebken avatar Jan 22 '25 16:01 luebken

Corrected link for fastmcp https://github.com/jlowin/fastmcp

bsima avatar Jan 27 '25 17:01 bsima

Spotted this nice CLI that you can configure to use MCP servers!

davidgasquez avatar Jan 31 '25 18:01 davidgasquez

> Spotted this nice CLI that you can configure to use MCP servers!

Interesting that this is also named llm

nbbaier avatar Feb 14 '25 05:02 nbbaier

What does MCP stand for?

Lewiscowles1986 avatar Feb 15 '25 02:02 Lewiscowles1986

Right, Model Context Protocol: https://github.com/modelcontextprotocol

Lewiscowles1986 avatar Feb 15 '25 02:02 Lewiscowles1986

Are you considering implementing MCP, @simonw?

ChristianWeyer avatar Mar 14 '25 16:03 ChristianWeyer

FastMCP is now the official MCP SDK: https://github.com/modelcontextprotocol/python-sdk

But yeah, this would be super cool. I've recently seen just how powerful and useful MCP can be and would love to play with it more, but I don't want to have to use Claude Desktop to do it. (It's not available on Linux, anyway.) Most of my AI queries these days are via llm anyway.

I wouldn't mind contributing to this integration; I'm just not familiar enough yet with the inner workings of llm and its plugin system.

notdaniel avatar Mar 18 '25 22:03 notdaniel

So I've actually written my own CLI MCP tool using Typer and Python, and it's open sourced here: https://github.com/lingster/mcp-llm. It only supports Claude for now, but PRs or suggestions for improvements are welcome!

lingster avatar Mar 21 '25 08:03 lingster

> Spotted this nice CLI that you can configure to use MCP servers!

This CLI uses LangChain under the hood, so it should work for any model for which the LangChain folks support tool calling. This support is mostly about knowing how to tell the model about the available tools.

For example, look at this code:

https://github.com/langchain-ai/langchain/blob/3848a1371da1a5bc6a4b0fcae461afcf529dc328/libs/partners/anthropic/langchain_anthropic/chat_models.py#L87

And compare it with the first code listing on this page: https://docs.anthropic.com/en/docs/build-with-claude/tool-use/overview
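
Roughly, that listing amounts to declaring each tool as a JSON schema in the request, something like the sketch below (paraphrased from memory of the Anthropic docs, so treat the model name and exact fields as illustrative):

```python
# Sketch of declaring a tool directly against the Anthropic API; this is the kind
# of payload the LangChain code linked above ultimately has to produce per provider.
import anthropic

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # example model name
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather for a given location.",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "City name"}
                },
                "required": ["location"],
            },
        }
    ],
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
)
print(response.content)
```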

rahimnathwani avatar Mar 23 '25 04:03 rahimnathwani

I plan to implement MCP support as a plugin on top of tools, once that lands:

  • https://github.com/simonw/llm/issues/898
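
For context, a rough sketch of what the MCP client half of such a plugin would need to do, following the stdio client example in the official modelcontextprotocol/python-sdk README (the llm-side tool registration is left out, since the tools API hadn't landed yet; "server.py" and the "search_email" tool are hypothetical):

```python
# Sketch: connect to an MCP server over stdio, list its tools, and call one.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(command="python", args=["server.py"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # These tool definitions are what a plugin would expose to llm as tools.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            result = await session.call_tool("search_email", arguments={"query": "invoices"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```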

simonw avatar Apr 08 '25 15:04 simonw

Hi @simonw, does that mean that you plan to implement MCP client capabilities as a plugin? Is there an issue to track that work, or any planned design? Would love to help if I can.

danielmeppiel avatar Jun 26 '25 06:06 danielmeppiel

@danielmeppiel I created a plugin for this and opened PR #1261 to list it in the docs. Would be nice if you gave it a try.

eugenepyvovarov avatar Aug 31 '25 16:08 eugenepyvovarov

@simonw Any plans on adding the above MCP support to the CLI tool? It seems like it would be extremely useful, as defining generic tools via schemas by hand is not considered best practice nowadays.

hasani114 avatar Sep 26 '25 11:09 hasani114