
[Feature]: Support MCP

Open jinlm opened this issue 1 year ago • 5 comments

Is your feature request related to a problem?

I'd like to see MCP support added to extend the AI's capabilities. Among the clients I currently use, LibreChat has just added support for MCP tools, but it isn't very polished. Cline's support is quite good, but it is, after all, an AI coding assistant rather than a general-purpose AI client.

Describe the solution you'd like

I suggest following Claude Desktop and Cline, rather than configuring it inside an agent the way LibreChat does. Supporting the two types tools and resources would be enough.
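For reference, Claude Desktop wires up MCP servers through a simple declarative config file (claude_desktop_config.json); a client-side settings file along these lines would match that model. The server name and arguments below are illustrative:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```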

Describe alternatives you've considered

No response

Additional context

No response

jinlm avatar Dec 23 '24 16:12 jinlm

+1. With MCP support, extending the app with custom services would become very convenient. It would also make Cherry Studio stand out from similar products as a distinctive feature.

gotoolkits avatar Dec 24 '24 07:12 gotoolkits

tools

Is there any documentation I can refer to?

kangfenmao avatar Dec 24 '24 08:12 kangfenmao

@kangfenmao You can refer to https://www.claudemcp.com/zh/docs; for the MCP client logic, https://github.com/mark3labs/mcphost is a good project to reference.

gotoolkits avatar Dec 24 '24 08:12 gotoolkits

+1. Once MCP support is added, working with multiple kinds of context will become very convenient.

YanxingLiu avatar Dec 24 '24 10:12 YanxingLiu

tools

Is there any documentation I can refer to?

https://modelcontextprotocol.io/ is the official documentation.

jinlm avatar Dec 24 '24 10:12 jinlm

MCP can effectively raise the ceiling of what the app can do. This really matters.

CaiJingLong avatar Jan 09 '25 08:01 CaiJingLong

AI assistant products with MCP integration already exist; take a look at "5ire". I'm confident Cherry Studio can implement this even better, upgrading from a standard agent to one that can be conveniently extended via MCP to complete complex tasks.

gotoolkits avatar Jan 23 '25 02:01 gotoolkits

The 5ire implementation is really too complex; even as a technical developer, I found it hard to learn how to use it.

kangfenmao avatar Jan 23 '25 02:01 kangfenmao

The 5ire implementation is really too complex, and it was hard for me as a technical developer to learn how to use it

Their implementation is also, in my opinion, flawed because the tools are all internal/bundled. Something closer to Claude's way is this: https://github.com/danny-avila/LibreChat/pull/5015.

https://www.librechat.ai/docs/configuration/librechat_yaml/object_structure/mcp_servers

Edit: An alternative would be to integrate support for this one: https://github.com/SecretiveShell/MCP-Bridge. On their Discord they advocate for UI integration. That could perhaps be easier, at least as a first step.

kvn1351 avatar Jan 26 '25 20:01 kvn1351

https://github.com/daodao97/chatmcp is an open-source MCP client implementation I came across, but it still feels a bit light on features.

wyih avatar Feb 07 '25 19:02 wyih

+1

JaylanLiu avatar Feb 10 '25 06:02 JaylanLiu

+1

seayuns avatar Feb 10 '25 11:02 seayuns

MCP consists of several largely independent parts, including Resources, Prompts, and Tools. Perhaps it doesn't have to be done all at once: support one of them first, then fill in the rest over time.

RyoJerryYu avatar Feb 13 '25 03:02 RyoJerryYu
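To illustrate the point above: in MCP, Tools, Resources, and Prompts are negotiated as independent server capabilities, so a client really can ship support for one at a time. A minimal sketch (the helper below is hypothetical, not SDK code) of gating client features on what a server advertises:

```typescript
// Capability shape mirrors the MCP spec: a server declares `tools`,
// `resources`, and `prompts` independently during initialization.
interface ServerCapabilities {
  tools?: object;
  resources?: object;
  prompts?: object;
}

type Feature = "tools" | "resources" | "prompts";

// Decide which UI features to surface for a given server, based on
// the capabilities it advertised.
function supportedFeatures(caps: ServerCapabilities): Feature[] {
  const features: Feature[] = [];
  if (caps.tools) features.push("tools");
  if (caps.resources) features.push("resources");
  if (caps.prompts) features.push("prompts");
  return features;
}

// A tools-only server is perfectly valid: the client simply skips
// resource and prompt listing for it.
supportedFeatures({ tools: {} }); // returns ["tools"]
```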

Please support MCP soon! If you don't have the bandwidth, we could develop it together.

tx991020 avatar Feb 15 '25 12:02 tx991020

Here is some code you could use as a reference:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport, StdioServerParameters } from "@modelcontextprotocol/sdk/client/stdio.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";
import OpenAI from "openai";
import { Tool } from "@modelcontextprotocol/sdk/types.js";
import { ChatCompletionMessageParam } from "openai/resources/chat/completions.js";
import { createInterface } from "readline";
import { homedir } from 'os';

const config = [
  {
    name: 'demo-stdio',
    type: 'command',
    command: 'node ~/mcp/build/demo-stdio.js',
    isOpen: true
  },
  {
    name: 'weather-stdio',
    type: 'command',
    command: 'node ~/mcp/build/weather-stdio.js',
    isOpen: true
  },
  {
    name: 'demo-sse',
    type: 'sse',
    url: 'http://localhost:3001/sse',
    isOpen: false
  }
]

// Validate required environment variables
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
if (!OPENAI_API_KEY) {
    throw new Error("OPENAI_API_KEY environment variable is required");
}

interface MCPToolResult {
    content: string;
}

interface ServerConfig {
    name: string;
    type: 'command' | 'sse';
    command?: string;
    url?: string;
    isOpen?: boolean;
}

class MCPClient {
    static getOpenServers(): string[] {
        return config.filter(cfg => cfg.isOpen).map(cfg => cfg.name);
    }
    private sessions: Map<string, Client> = new Map();
    private transports: Map<string, StdioClientTransport | SSEClientTransport> = new Map();
    private openai: OpenAI;

    constructor() {
        this.openai = new OpenAI({
            apiKey: OPENAI_API_KEY
        });
    }

    async connectToServer(serverName: string): Promise<void> {
        const serverConfig = config.find(cfg => cfg.name === serverName) as ServerConfig;
        if (!serverConfig) {
            throw new Error(`Server configuration not found for: ${serverName}`);
        }

        let transport: StdioClientTransport | SSEClientTransport;
        if (serverConfig.type === 'command' && serverConfig.command) {
            transport = await this.createCommandTransport(serverConfig.command);
        } else if (serverConfig.type === 'sse' && serverConfig.url) {
            transport = await this.createSSETransport(serverConfig.url);
        } else {
            throw new Error(`Invalid server configuration for: ${serverName}`);
        }

        const client = new Client(
            {
                name: "mcp-client",
                version: "1.0.0"
            },
            {
                capabilities: {
                    prompts: {},
                    resources: {},
                    tools: {}
                }
            }
        );

        await client.connect(transport);
        
        this.sessions.set(serverName, client);
        this.transports.set(serverName, transport);

        // List the server's available tools
        const response = await client.listTools();
        console.log(`\nConnected to server '${serverName}' with tools:`, response.tools.map((tool: Tool) => tool.name));
    }

    private async createCommandTransport(shell: string): Promise<StdioClientTransport> {
        const [command, ...shellArgs] = shell.split(' ');
        if (!command) {
            throw new Error("Invalid shell command");
        }

        // Expand tilde (~) paths in arguments
        const args = shellArgs.map(arg => {
            if (arg.startsWith('~/')) {
                return arg.replace('~', homedir());
            }
            return arg;
        });
        
        const serverParams: StdioServerParameters = {
            command,
            args,
            env: Object.fromEntries(
                Object.entries(process.env).filter(([_, v]) => v !== undefined)
            ) as Record<string, string>
        };

        return new StdioClientTransport(serverParams);
    }

    private async createSSETransport(url: string): Promise<SSEClientTransport> {
        return new SSEClientTransport(new URL(url));
    }

    async processQuery(query: string): Promise<string> {
        if (this.sessions.size === 0) {
            throw new Error("Not connected to any server");
        }

        const messages: ChatCompletionMessageParam[] = [
            {
                role: "user",
                content: query
            }
        ];

        // Gather the tool lists from all connected servers
        const availableTools: any[] = [];
        for (const [serverName, session] of this.sessions) {
            const response = await session.listTools();
            const tools = response.tools.map((tool: Tool) => ({
                type: "function" as const,
                function: {
                    name: `${serverName}__${tool.name}`,
                    description: `[${serverName}] ${tool.description}`,
                    parameters: tool.inputSchema
                }
            }));
            availableTools.push(...tools);
        }

        // Call the OpenAI API
        const completion = await this.openai.chat.completions.create({
            model: "gpt-4-turbo-preview",
            messages,
            tools: availableTools,
            tool_choice: "auto"
        });

        const finalText: string[] = [];
        
        // Process the OpenAI response
        for (const choice of completion.choices) {
            const message = choice.message;
            
            if (message.content) {
                finalText.push(message.content);
            }

            if (message.tool_calls) {
                for (const toolCall of message.tool_calls) {
                    const [serverName, toolName] = toolCall.function.name.split('__');
                    const session = this.sessions.get(serverName);
                    
                    if (!session) {
                        finalText.push(`[Error: Server ${serverName} not found]`);
                        continue;
                    }

                    const toolArgs = JSON.parse(toolCall.function.arguments);

                    // Execute the tool call
                    const result = await session.callTool({
                        name: toolName,
                        arguments: toolArgs
                    });

                    const toolResult = result as unknown as MCPToolResult;
                    finalText.push(`[Calling tool ${toolName} on server ${serverName} with args ${JSON.stringify(toolArgs)}]`);
                    console.log(toolResult.content);
                    finalText.push(toolResult.content);

                    // Continue the conversation with the tool result
                    messages.push({
                        role: "assistant",
                        content: "",
                        tool_calls: [toolCall]
                    });
                    messages.push({
                        role: "tool",
                        tool_call_id: toolCall.id,
                        content: toolResult.content
                    });

                    // Get the follow-up response
                    const nextCompletion = await this.openai.chat.completions.create({
                        model: "gpt-4-turbo-preview",
                        messages,
                        tools: availableTools,
                        tool_choice: "auto"
                    });

                    if (nextCompletion.choices[0].message.content) {
                        finalText.push(nextCompletion.choices[0].message.content);
                    }
                }
            }
        }

        return finalText.join("\n");
    }

    async chatLoop(): Promise<void> {
        console.log("\nMCP Client Started!");
        console.log("Type your queries or 'quit' to exit.");

        const readline = createInterface({
            input: process.stdin,
            output: process.stdout
        });

        const askQuestion = () => {
            return new Promise<string>((resolve) => {
                readline.question("\nQuery: ", resolve);
            });
        };

        try {
            while (true) {
                const query = (await askQuestion()).trim();

                if (query.toLowerCase() === 'quit') {
                    break;
                }

                try {
                    const response = await this.processQuery(query);
                    console.log("\n" + response);
                } catch (error) {
                    console.error("\nError:", error);
                }
            }
        } finally {
            readline.close();
        }
    }

    async cleanup(): Promise<void> {
        for (const transport of this.transports.values()) {
            await transport.close();
        }
        this.transports.clear();
        this.sessions.clear();
    }

    hasActiveSessions(): boolean {
        return this.sessions.size > 0;
    }
}

// Main entry point
async function main() {
    const openServers = MCPClient.getOpenServers();
    console.log("Connecting to servers:", openServers.join(", "));
    const client = new MCPClient();
    
    try {
        // Connect to all enabled servers
        for (const serverName of openServers) {
            try {
                await client.connectToServer(serverName);
            } catch (error) {
                console.error(`Failed to connect to server '${serverName}':`, error);
            }
        }

        if (!client.hasActiveSessions()) {
            throw new Error("Failed to connect to any server");
        }

        await client.chatLoop();
    } finally {
        await client.cleanup();
    }
}

// Run the main function
main().catch(console.error);

yufeng201 avatar Feb 18 '25 10:02 yufeng201

Is there any plan or roadmap for this?

WAY29 avatar Feb 18 '25 15:02 WAY29

Could these two open-source clients, https://github.com/daodao97/chatmcp and https://5ire.app/, serve as references? Once MCP tools are supported, the most basic web search and page fetching come for free, with no more need to rely on Volcengine to set up a web-search bot. A huge capability boost, with unlimited room for imagination beyond that.

wangfh5 avatar Feb 21 '25 02:02 wangfh5

Both https://github.com/daodao97/chatmcp and https://5ire.app/ have serious bugs; I couldn't even get MySQL to connect.

tx991020 avatar Feb 23 '25 08:02 tx991020

Of what I've tried so far, Cline, Roo Code, and goose are the most comfortable to use. Cline and Roo Code can do function calls with just about any model, so they probably have something of their own built in; goose requires the model itself to support function calling, so e.g. Gemini's experimental models don't work with it.

wangfh5 avatar Feb 23 '25 12:02 wangfh5

Of what I've tried so far, Cline, Roo Code, and goose are the most comfortable to use. Cline and Roo Code can do function calls with just about any model, so they probably have something of their own built in; goose requires the model itself to support function calling, so e.g. Gemini's experimental models don't work with it.

Cline/Roo-Code probably use a prompt-based fallback.

WAY29 avatar Feb 23 '25 12:02 WAY29

Cline/Roo-Code implement MCP entirely through prompts, which sidesteps the differences between models like OpenAI's and Claude, and also lets models without function-calling support use MCP.

wen313 avatar Feb 24 '25 02:02 wen313
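The prompt-based approach described above can be sketched roughly like this (the tag format and parser are illustrative, not Cline's actual protocol): the client describes the tools in the system prompt, asks the model to answer with a structured block, and parses that block itself, so no native function-calling support is needed.

```typescript
interface ToolCallRequest {
  tool: string;
  args: Record<string, string>;
}

// Parse a model response containing a block like:
//   <use_tool><name>weather</name><arg key="city">Beijing</arg></use_tool>
// Returns null when the model answered in plain text (no tool requested).
function parseToolCall(text: string): ToolCallRequest | null {
  const block = text.match(/<use_tool>([\s\S]*?)<\/use_tool>/);
  if (!block) return null;
  const name = block[1].match(/<name>(.*?)<\/name>/);
  if (!name) return null;
  const args: Record<string, string> = {};
  for (const m of block[1].matchAll(/<arg key="(.*?)">([\s\S]*?)<\/arg>/g)) {
    args[m[1]] = m[2];
  }
  return { tool: name[1], args };
}
```

The client then executes the parsed call against the MCP server and feeds the result back as an ordinary user/assistant turn, looping until the model stops requesting tools.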

Keep up the good work, hoping MCP support lands soon!

YangZyyyy avatar Feb 25 '25 15:02 YangZyyyy

Looking forward to it.

wb-hwang avatar Feb 27 '25 08:02 wb-hwang

The 5ire implementation is really too complex; even as a technical developer, I found it hard to learn how to use it.

Hoping one of the experts here can put together a PR for this.

sunqb avatar Mar 05 '25 07:03 sunqb

The main branch already supports MCP; it's still being tested.

eeee0717 avatar Mar 06 '25 07:03 eeee0717

@eeee0717 Each MCP server needs one more option: a timeout. Some MCP servers take a long time to respond, deep research for example. See Cline's implementation for reference: https://github.com/cline/cline/pull/2018

duanhongyi avatar Mar 06 '25 08:03 duanhongyi
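A per-server timeout like this could be sketched as a simple race around the tool call (the `timeoutMs` option name and the 60s default below are assumptions for illustration, not Cherry Studio's actual API):

```typescript
// Race a promise against a timer, so a slow MCP server (e.g. deep
// research) can be given a longer budget than the default.
function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`${label} timed out after ${ms}ms`)),
      ms
    );
    promise
      .then((value) => { clearTimeout(timer); resolve(value); })
      .catch((err) => { clearTimeout(timer); reject(err); });
  });
}

// Hypothetical usage with a per-server config value, falling back to 60s:
// const result = await withTimeout(
//   session.callTool({ name: toolName, arguments: toolArgs }),
//   serverConfig.timeoutMs ?? 60_000,
//   serverName
// );
```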

Pinging @vaayne for you; he's the one working on MCP.

eeee0717 avatar Mar 06 '25 08:03 eeee0717

It might also be worth looking at HyperChat, a fairly low-profile client that supports MCP. Its basic features aren't as polished as Cherry's, but of everything I've used, its MCP support is the best:

  1. Supports adding custom MCP servers (the only drawback: stdio only, no SSE)
  2. Supports creating agents with specific MCP servers checked. This matters a lot: you can pick a different toolset for each scenario; otherwise, with too many MCP servers enabled, tokens get wasted and the model sometimes picks the wrong tool
  3. Tool calling is quite stable, and both the process and the results are easy to follow
  4. Has some other cool features, e.g. a custom agent can itself be exposed as a tool for other agents to call

yarray avatar Mar 07 '25 04:03 yarray

It might also be worth looking at HyperChat, a fairly low-profile client that supports MCP. Its basic features aren't as polished as Cherry's, but of everything I've used, its MCP support is the best:

  1. Supports adding custom MCP servers (the only drawback: stdio only, no SSE)
  2. Supports creating agents with specific MCP servers checked. This matters a lot: you can pick a different toolset for each scenario; otherwise, with too many MCP servers enabled, tokens get wasted and the model sometimes picks the wrong tool
  3. Tool calling is quite stable, and both the process and the results are easy to follow
  4. Has some other cool features, e.g. a custom agent can itself be exposed as a tool for other agents to call

@yarray Great suggestions. Cherry already supports both stdio and SSE, and the chat UI shows tool-call activity. Work in progress: choosing which MCP servers to enable from the chat UI. In the future we'll also support configuring which servers each assistant starts with.

vaayne avatar Mar 07 '25 04:03 vaayne