Support for LLMs other than Sonnet?
Is there any support for using MCP not just with Sonnet but with other LLMs, such as Llama or GPT?
I guess one would need to write a compatibility layer. I was able to get it working with local LLMs using https://github.com/patruff/ollama-mcp-bridge
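For context, that bridge is driven by a JSON config that pairs standard MCP server definitions with an Ollama endpoint. The sketch below is illustrative only — the exact field names and the example model/server entries are assumptions, so check the repo's README for the current schema:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct",
    "baseUrl": "http://localhost:11434"
  }
}
```

The `mcpServers` section follows the same shape as a Claude Desktop config, which is what makes reusing existing MCP servers with a local model straightforward.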
Gemini 2.5 should also be supported.
This library is for building MCP servers and clients and, apart from the sampling feature (which is LLM-agnostic), is not concerned with integrating with specific agent libraries. I expect that you will see MCP support added to the respective agent libraries as time goes on!
@jlowin quick question - what is the best way to integrate FastMCP in a production environment with open-webui and ollama? I have a container for open-webui and another for ollama, and I'm wondering if I should add another container for FastMCP or add it to the open-webui container. Thanks in advance.