lobe-chat
[Request] The Claude API now officially supports function calling; please enable the plugin feature for Claude
🥰 Description of requirements
Anthropic has announced to developers that the Claude API now supports function calling, and we would like the plugin feature to be enabled for Claude.
Official documentation: https://docs.anthropic.com/claude/docs/tool-use
🧐 Solution
In theory, the Claude API's function-calling interface can be migrated from the existing OpenAI integration; support for this would be appreciated.
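For reference, a minimal tool-use request per the linked docs might look roughly like the sketch below (the model id and the `client.beta.tools.messages` surface are assumptions, matching the `@anthropic-ai/sdk` beta discussed later in this thread):

```ts
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

async function main() {
  // Tool definitions are flat objects with a JSON Schema under `input_schema`
  const response = await client.beta.tools.messages.create({
    max_tokens: 1024,
    model: 'claude-3-opus-20240229',
    tools: [
      {
        name: 'get_current_weather',
        description: 'Get the current weather in a given location',
        input_schema: {
          type: 'object',
          properties: {
            location: { type: 'string', description: 'The city and state, e.g. San Francisco, CA' },
            unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
          },
          required: ['location'],
        },
      },
    ],
    messages: [{ role: 'user', content: "What's the weather like in Boston today?" }],
  });

  // If the model decides to call the tool, the reply contains a `tool_use` content block
  console.log(response.content);
}

main();
```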
📝 Supplementary information
No response
👀 @AliceRabbit
Thank you for raising an issue. We will look into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.
I've looked at it; it's not exactly the same and needs special adaptation.
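Concretely, the shapes differ in a few places; here is a side-by-side sketch of the tool definitions (field names taken from the two providers' docs):

```ts
// OpenAI-style definition: wrapped in { type: 'function', function: { ... } },
// with the JSON Schema under `parameters`. Tool calls come back in
// `choices[0].message.tool_calls`, with `arguments` as a JSON string.
const openaiTool = {
  type: 'function',
  function: {
    name: 'get_current_weather',
    description: 'Get the current weather in a given location',
    parameters: {
      type: 'object',
      properties: { location: { type: 'string' } },
      required: ['location'],
    },
  },
};

// Anthropic-style definition: a flat object with the JSON Schema under
// `input_schema`. Tool calls come back as a `tool_use` content block
// (stop_reason === 'tool_use'), with `input` as an already-parsed object.
const anthropicTool = {
  name: 'get_current_weather',
  description: 'Get the current weather in a given location',
  input_schema: {
    type: 'object',
    properties: { location: { type: 'string' } },
    required: ['location'],
  },
};
```

Tool results also travel differently: OpenAI expects a `role: 'tool'` message keyed by `tool_call_id`, while Anthropic expects a `tool_result` content block inside a `user` message.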
@arvinxx Is this project still using the outdated functions API structure? Since the 1106 models, OpenAI has been rolling out the tools API, whose structure should be consistent with Anthropic's API.
Official example:
```js
import OpenAI from "openai";

const openai = new OpenAI();

async function main() {
  const messages = [{"role": "user", "content": "What's the weather like in Boston today?"}];
  const tools = [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA",
            },
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
          },
          "required": ["location"],
        },
      }
    }
  ];

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: messages,
    tools: tools,
    tool_choice: "auto",
  });

  console.log(response);
}

main();
```
The request payload has already been switched to tools, but message handling still uses function; that will be reworked before 1.0. Related issue: #999
More bad news: {"type":"error","error":{"type":"invalid_request_error","message":"stream: Tool use is not yet supported in streaming mode."}}
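Until streaming is supported for tool use upstream, one possible workaround (just a sketch of the idea, not the project's actual code) is to fall back to a blocking request whenever tools are attached and keep streaming only for plain chat:

```ts
import Anthropic from '@anthropic-ai/sdk';
import { MessageParam } from '@anthropic-ai/sdk/resources/messages';
import { Tool } from '@anthropic-ai/sdk/resources/beta/tools/messages';

// Sketch: the tools beta rejects `stream: true`, so only stream when no tools are attached.
async function createClaudeMessage(
  client: Anthropic,
  params: { model: string; max_tokens: number; messages: MessageParam[]; system?: string },
  tools?: Tool[],
) {
  if (tools && tools.length > 0) {
    // Blocking request: the full message (including any tool_use blocks) arrives at once.
    return client.beta.tools.messages.create({ ...params, tools });
  }
  // No tools: keep the existing streaming path.
  return client.messages.create({ ...params, stream: true });
}
```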
I've made a rough first pass; it should look more or less like this:
```diff
diff --git a/package.json b/package.json
index cc4da1f7..076e51c2 100644
--- a/package.json
+++ b/package.json
@@ -79,7 +79,7 @@
   },
   "dependencies": {
     "@ant-design/icons": "^5",
-    "@anthropic-ai/sdk": "^0.18.0",
+    "@anthropic-ai/sdk": "^0.20.1",
     "@auth/core": "0.28.0",
     "@aws-sdk/client-bedrock-runtime": "^3.525.0",
     "@azure/openai": "^1.0.0-beta.11",
diff --git a/src/config/modelProviders/anthropic.ts b/src/config/modelProviders/anthropic.ts
index ff5ca82f..10592f4e 100644
--- a/src/config/modelProviders/anthropic.ts
+++ b/src/config/modelProviders/anthropic.ts
@@ -6,6 +6,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Ideal balance of intelligence and speed for enterprise workloads. Maximum utility at a lower price, dependable, balanced for scaled deployments',
       displayName: 'Claude 3 Sonnet',
+      functionCall: true,
       id: 'claude-3-sonnet-20240229',
       maxOutput: 4096,
       tokens: 200_000,
@@ -15,6 +16,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Most powerful model for highly complex tasks. Top-level performance, intelligence, fluency, and understanding',
       displayName: 'Claude 3 Opus',
+      functionCall: true,
      id: 'claude-3-opus-20240229',
       maxOutput: 4096,
       tokens: 200_000,
@@ -24,6 +26,7 @@ const Anthropic: ModelProviderCard = {
       description:
         'Fastest and most compact model for near-instant responsiveness. Quick and accurate targeted performance',
       displayName: 'Claude 3 Haiku',
+      functionCall: true,
       id: 'claude-3-haiku-20240307',
       maxOutput: 4096,
       tokens: 200_000,
diff --git a/src/libs/agent-runtime/anthropic/index.ts b/src/libs/agent-runtime/anthropic/index.ts
index 0ab9bf7b..e4134cb0 100644
--- a/src/libs/agent-runtime/anthropic/index.ts
+++ b/src/libs/agent-runtime/anthropic/index.ts
@@ -1,6 +1,7 @@
 // sort-imports-ignore
 import '@anthropic-ai/sdk/shims/web';
 import Anthropic from '@anthropic-ai/sdk';
+import { Tool } from '@anthropic-ai/sdk/resources/beta/tools/messages';
 import { AnthropicStream, StreamingTextResponse } from 'ai';
 import { ClientOptions } from 'openai';
@@ -20,29 +21,42 @@ const DEFAULT_BASE_URL = 'https://api.anthropic.com';
 export class LobeAnthropicAI implements LobeRuntimeAI {
   private client: Anthropic;
-
+
   baseURL: string;
   constructor({ apiKey, baseURL = DEFAULT_BASE_URL }: ClientOptions) {
     if (!apiKey) throw AgentRuntimeError.createError(AgentRuntimeErrorType.InvalidAnthropicAPIKey);
-
+
     this.client = new Anthropic({ apiKey, baseURL });
     this.baseURL = this.client.baseURL;
   }
   async chat(payload: ChatStreamPayload, options?: ChatCompetitionOptions) {
-    const { messages, model, max_tokens, temperature, top_p } = payload;
+    const { messages, model, max_tokens, temperature, top_p, tools } = payload;
     const system_message = messages.find((m) => m.role === 'system');
     const user_messages = messages.filter((m) => m.role !== 'system');
     try {
-      const response = await this.client.messages.create({
+      let anthropicTools = new Array<Tool>();
+      if (tools) {
+        for (const tool of tools) {
+          let anthropicTool: Tool = {
+            description: tool.function.description,
+            input_schema: tool.function.parameters as Tool.InputSchema,
+            name: tool.function.name,
+          }
+          anthropicTools.push(anthropicTool);
+        }
+      }
+
+      const response = await this.client.beta.tools.messages.create({
         max_tokens: max_tokens || 4096,
         messages: buildAnthropicMessages(user_messages),
         model: model,
         stream: true,
         system: system_message?.content as string,
         temperature: temperature,
+        tools: anthropicTools,
         top_p: top_p,
       });
```
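The diff above only touches the request side. On the response side, the `tool_use` content block that comes back would still need to be mapped into whatever OpenAI-shaped structure the rest of the pipeline expects; a rough sketch of that mapping (a hypothetical helper, not part of the diff) could look like:

```ts
// Hypothetical helper: convert Anthropic `tool_use` content blocks into
// OpenAI-style tool_calls. Block shape per the tool-use docs:
// { type: 'tool_use', id, name, input }.
interface ToolUseBlock {
  type: 'tool_use';
  id: string;
  name: string;
  input: unknown;
}

const mapToolUseBlocks = (content: Array<ToolUseBlock | { type: string }>) =>
  content
    .filter((block): block is ToolUseBlock => block.type === 'tool_use')
    .map((block) => ({
      id: block.id,
      type: 'function' as const,
      function: {
        name: block.name,
        // OpenAI tool calls carry `arguments` as a JSON string, while
        // Anthropic returns `input` as an already-parsed object.
        arguments: JSON.stringify(block.input),
      },
    }));
```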
Can this be used as-is? Has the streaming issue been solved?
No, no, no, I've only made a rough first pass at the beginning.
✅ @AliceRabbit
This issue is closed. If you have any questions, you can comment and reply.
:tada: This issue has been resolved in version 0.157.0 :tada:
The release is available on:
Your semantic-release bot :package::rocket: