feat: support MCP client
Feature request
I am honestly new to this repo and relatively new to MCP, but I have a general understanding of MCP clients and MCP servers.
I built a local Hello-World setup (using FastAgent) in which an MCP client communicates with an MCP server. Currently I am using an Anthropic LLM (Sonnet), which is hosted on Anthropic's side.
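For reference, the core of my hello-world looks roughly like this (a sketch from memory of fast-agent's decorator API; the `fetch` server name and the prompt are just examples, and `servers` refers to MCP servers declared in `fastagent.config.yaml`):

```python
import asyncio
from mcp_agent.core.fastagent import FastAgent

fast = FastAgent("hello-world")

# "sonnet" selects Anthropic's hosted model; "fetch" is an example MCP
# server (declared in fastagent.config.yaml) that gives the LLM a web tool.
@fast.agent(
    instruction="You are a helpful assistant.",
    servers=["fetch"],
    model="sonnet",
)
async def main():
    async with fast.run() as agent:
        await agent("Say hello, then summarize https://example.com")

if __name__ == "__main__":
    asyncio.run(main())
```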
I am looking for a self-hosted LLM that I could use (ideally with a chat GUI) together with MCP servers (tooling) to equip the LLM with extra capabilities and actions.
I am not sure whether this repo supports MCP or not.
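To make the ask concrete: OpenLLM already exposes an OpenAI-compatible endpoint through `openllm serve`, so in principle an MCP-capable client could target it the same way it targets Anthropic today. A minimal sketch, assuming a model served locally on the default port and using the `openai` Python client (the tool definition here is hypothetical and stands in for what an MCP server would advertise):

```python
from openai import OpenAI

# Assumption: a model is running locally via `openllm serve <model>`,
# exposing an OpenAI-compatible API (default port 3000).
client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")

# A hypothetical tool of the kind an MCP server would advertise; the
# feature request is for this to be discovered over MCP rather than
# hard-coded by the caller.
tools = [{
    "type": "function",
    "function": {
        "name": "fetch_url",
        "description": "Fetch a web page and return its text.",
        "parameters": {
            "type": "object",
            "properties": {"url": {"type": "string"}},
            "required": ["url"],
        },
    },
}]

resp = client.chat.completions.create(
    model="my-local-model",  # placeholder for whatever model is served
    messages=[{"role": "user", "content": "Summarize https://example.com"}],
    tools=tools,
)
print(resp.choices[0].message)
```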
Motivation
Having a self-hosted LLM with MCP capabilities.
Other
No response