Proposal: MCP Support
It would be good to implement the MCP client protocol so that existing tools can be used.
DOCS: https://modelcontextprotocol.io
Other link: https://github.com/punkpeye/awesome-mcp-servers
Which transport do you want to support first, stdio or SSE? Have you started working on it?
I won't do it. Alpaca already has lots of tooling integrated with the UI, I'm pretty much a solo developer, and I don't see a reason why I should throw away my work just to do the same thing, but now with yet another library to worry about.
MCP isn't a library but a protocol: a protocol for giving LLMs context. More and more tools support providing context through MCP, and some LLM models are now specifically trained to work with it.
But I can understand that throwing the current features away on a hobby project probably isn't the right decision.
I personally would see this as something that should be implemented at the Ollama level (https://github.com/ollama/ollama/issues/7865), and then, once the protocol has gained enough traction, it could maybe be implemented here?
MCP is basically LSP but for LLMs. It seems most likely that it will be the primary protocol for giving LLMs context.
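To make the "protocol, not library" point concrete: MCP messages are plain JSON-RPC 2.0, so no particular SDK is required. A minimal sketch of the client side, assuming the `initialize` and `tools/list` methods from the spec (field values like the protocol version string are illustrative):

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP uses."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# First message a client sends: initialize, declaring its capabilities.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "alpaca", "version": "0.1"},
})

# After the server replies, the client can discover the tools it offers.
list_tools = make_request(2, "tools/list")

print(json.dumps(init))
print(json.dumps(list_tools))
```

Any process that can speak this over stdio or an HTTP/SSE stream is an MCP client; there is no library lock-in.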
I understand the reluctance, and everyone and every project is free to do as they wish, but I don't believe it's an overstatement to say that not supporting MCP is a mistake that may eventually end in Alpaca being made obsolete by a tool that does. Much of the AI scene is converging on MCP right now since, as mentioned above, it is a new standardized protocol, much like LSP (the Language Server Protocol), that lets AI providers/'hosts', LLMs, and tools (servers) interact.
Instead of building tools for a single application such as Alpaca, they will instead be built for MCP, meaning they will also be reusable with other providers. We can expect libraries of these tools to emerge that allow downloading and plugging them into your favourite application and model (hopefully including Alpaca in the future).
As a concrete example, @bilelmoussaoui has recently posted an example GNOME MCP server that provides tools to manipulate the GNOME desktop. By integrating MCP, one could use Alpaca and this tool to ask e.g. 'Start the Firefox application' (either textually or using speech-to-text on top), and that is just the beginning: proprietary tools such as Cursor.ai already have their own integrations that can add context from GitHub issues, Figma designs, and more, which will eventually come to MCP as well. It opens up a whole library of potential integrations.
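For the desktop example above, the request the client would send is just a `tools/call` message. The tool name `launch_application` and its arguments here are hypothetical, purely to illustrate the shape of the message, not the actual GNOME MCP server's API:

```python
import json

# Hypothetical tools/call asking a desktop MCP server to start an app.
call = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "launch_application",           # hypothetical tool name
        "arguments": {"app_id": "org.mozilla.firefox"},
    },
}
print(json.dumps(call))
```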
@Jeffser Please reconsider. Alpaca is the best native GUI chat client for Ubuntu, but the lack of MCP support makes it practically unusable for me.
So what do you want? A server? A client? Both?
Just for tools or the whole thing?
If Alpaca were to include MCP, it would have to ship the Ollama MCP server I've linked above, and then on the one hand connect it to its Ollama instance, and on the other expose it to third-party programs.
It could of course also rewrite some of its existing context features to use MCP instead, or MCP could only be used as the foundation for new features.
If somebody uses an external Ollama, though, they would need to manage the MCP server for it themselves and give Alpaca access to it.
But I guess the main target should stay the internal Ollama instance.
Definitely not a small task.
Hi! I believe the MCP topic should be divided into two streams:
- MCP client in Alpaca - this allows massive tool reuse from the internet or from local servers (network or stdio)
- MCP server - this defines how to use Alpaca from other apps
I believe that, as Alpaca is a local companion app, the MCP client should be the priority, as it would allow broader tool usage for users.
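For the client stream, the stdio transport is essentially newline-delimited JSON-RPC over a child process's stdin/stdout. A minimal sketch of that plumbing, where the "server" is a stand-in one-liner that answers every request with an empty result (a real MCP server would be spawned the same way):

```python
import json
import subprocess
import sys

# Stand-in server: reads one JSON-RPC request per line, replies with {}.
# This is only to demonstrate the stdio framing, not a real MCP server.
FAKE_SERVER = (
    "import sys, json\n"
    "for line in sys.stdin:\n"
    "    req = json.loads(line)\n"
    "    resp = {'jsonrpc': '2.0', 'id': req['id'], 'result': {}}\n"
    "    print(json.dumps(resp), flush=True)\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", FAKE_SERVER],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)

def rpc(req_id, method, params=None):
    """Send one JSON-RPC request over stdio and read the reply line."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    proc.stdin.write(json.dumps(msg) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

reply = rpc(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "alpaca", "version": "0.1"},
})
print(reply["id"])  # 1

proc.stdin.close()
proc.wait()
```

The same `rpc` helper would then carry `tools/list` and `tools/call` traffic once a real server is on the other end.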