.Net Function Calling with Bedrock Claude
Hey there, am I missing something here, or does function calling for Claude 3.5 Sonnet via the Bedrock connector simply not work? I know it's still experimental, but is there a plan to support it?
Hey @MalteFries, unfortunately it's not working at the moment.
If you look at the code below, it expects us to pass "tools" and "tool_choice" objects in the ExtensionData.
https://github.com/microsoft/semantic-kernel/blame/7a61c9caff11f73bed93687144de3afda85fd39d/dotnet/src/Connectors/Connectors.Amazon/Bedrock/Core/Models/Anthropic/AnthropicService.cs#L71C9-L82C10
But ClaudeToolUse.ClaudeTool is internal, so it's not available outside the SK library and we can't construct it ourselves.
@markwallace-microsoft, is anyone working on it? If I have some guidance I could try creating a PR.
My idea would be to do something similar to what the OpenAI connector does, removing the need to check ExtensionData.
https://github.com/microsoft/semantic-kernel/blob/7a61c9caff11f73bed93687144de3afda85fd39d/dotnet/src/Connectors/Connectors.OpenAI/Core/ClientCore.ChatCompletion.cs#L150
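To make the idea concrete, here is a rough sketch (my own simplification, not the actual connector code) of deriving an Anthropic-style tool list from the kernel's registered functions, similar to how the OpenAI connector builds its tool definitions, instead of requiring callers to pre-build objects in ExtensionData. The anonymous-object payload shape and the string-typed parameters are assumptions for illustration only:

using System.Collections.Generic;
using System.Linq;
using Microsoft.SemanticKernel;

// Sketch only: build an Anthropic-style "tools" payload from the kernel's
// registered functions instead of reading pre-built objects from ExtensionData.
static IEnumerable<object> BuildAnthropicTools(Kernel kernel) =>
    kernel.Plugins.GetFunctionsMetadata().Select(f => new
    {
        name = $"{f.PluginName}-{f.Name}",
        description = f.Description,
        input_schema = new
        {
            type = "object",
            // Assumes string parameters for simplicity; a real implementation would
            // map KernelParameterMetadata.Schema / ParameterType properly.
            properties = f.Parameters.ToDictionary(
                p => p.Name,
                p => (object)new { type = "string", description = p.Description }),
            required = f.Parameters.Where(p => p.IsRequired).Select(p => p.Name)
        }
    });

The connector could then serialize this list into the Anthropic request body whenever FunctionChoiceBehavior is set, and map any tool_use blocks in the response back to FunctionCallContent.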
I also need function calling with bedrock (claude).
In my experiments I get the following exception as soon as I add PromptExecutionSettings with FunctionChoiceBehavior.Auto():
Microsoft.SemanticKernel.Connectors.Amazon.Core.BedrockChatCompletionClient[0]
Can't converse with 'anthropic.claude-3-sonnet-20240229-v1:0'. Reason: 1 validation error detected: Value '0' at 'inferenceConfig.maxTokens' failed to satisfy constraint: Member must have value greater than or equal to 1
Amazon.BedrockRuntime.Model.ValidationException: 1 validation error detected: Value '0' at 'inferenceConfig.maxTokens' failed to satisfy constraint: Member must have value greater than or equal to 1
If it is really just a matter of an inaccessible property, why not get it using reflection? The connector is experimental after all.
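Purely as an illustration of the reflection idea (the assembly name and the internal type's members below are guesses; only ClaudeToolUse.ClaudeTool itself is named in the connector source, so this would need to be adapted to whatever shape the type actually has):

using System;
using System.Linq;
using System.Reflection;

// Hypothetical sketch: locate the internal ClaudeTool type via reflection and construct
// an instance to drop into PromptExecutionSettings.ExtensionData["tools"].
// The assembly name is assumed and should be verified against the package.
var connectorAssembly = Assembly.Load("Microsoft.SemanticKernel.Connectors.Amazon");
var claudeToolType = connectorAssembly.GetTypes().FirstOrDefault(t => t.Name == "ClaudeTool");

if (claudeToolType is not null)
{
    var claudeTool = Activator.CreateInstance(claudeToolType, nonPublic: true);
    // Assumed member names; set the tool's name/description/schema via reflection here,
    // e.g. claudeToolType.GetProperty("Name")?.SetValue(claudeTool, "get_radio_song");
}

It is obviously brittle, but since the connector is experimental it might unblock people until proper support lands.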
Hey @halllo,
To make it work I had to set the "max_tokens_to_sample" property manually as you can see below:
PromptExecutionSettings promptExecutionSettings = new()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(),
    ModelId = modelid,
    ExtensionData = new Dictionary<string, object>()
    {
        { "max_tokens_to_sample", 4096 }
    },
};
And then I pass the prompt execution settings while calling the chat completion service:
var result = await chatcompletionservice.GetChatMessageContentAsync(
    history,
    executionSettings: promptExecutionSettings,
    kernel: kernel);
See if that works for you!
Hello @Douuuglas.
Thanks for taking the time to look into this. With your proposed change I no longer see the exception, but the function is still not invoked. Is it invoking the function on your end?
Nice @halllo,
Unfortunately it won't work; as I stated in the comment below, this still needs to be implemented in the connector.
https://github.com/microsoft/semantic-kernel/issues/9750#issuecomment-2604762486
I have now built my own function calling abstraction over the IAmazonBedrockRuntime:
await agent.Do(
    task: "Get the most popular song played on the radio station RGBG and rate it as bad.",
    tools:
    [
        Agent.Tool.From([Description("Get radio song")] ([Description("The call sign for the radio station for which you want the most popular song."), Required] string sign)
            => new { songName = "Random Song 1" }),
        Agent.Tool.From([Description("Rate a song")] (string song, string rating)
            => "Rated!"),
    ]);
This is the tool specification serialization: https://github.com/halllo/DotnetAgentExperiments/blob/main/aws.bed/Agent.cs#L123
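For anyone going the same route, this is roughly what the tool specification boils down to with the plain AWS SDK for .NET Converse API (a trimmed-down sketch, not the exact code from the linked repo; the model id and schema values are placeholders):

using System.Collections.Generic;
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;
using Amazon.Runtime.Documents;

// Sketch: pass a tool specification to the Bedrock Converse API directly.
var request = new ConverseRequest
{
    ModelId = "anthropic.claude-3-5-sonnet-20240620-v1:0",
    Messages =
    [
        new Message
        {
            Role = ConversationRole.User,
            Content = [new ContentBlock { Text = "Get the most popular song played on the radio station RGBG and rate it as bad." }]
        }
    ],
    ToolConfig = new ToolConfiguration
    {
        Tools =
        [
            new Tool
            {
                ToolSpec = new ToolSpecification
                {
                    Name = "get_radio_song",
                    Description = "Get the most popular song for a radio station.",
                    InputSchema = new ToolInputSchema
                    {
                        // JSON schema for the tool input, expressed as a Document.
                        Json = new Document(new Dictionary<string, Document>
                        {
                            ["type"] = "object",
                            ["properties"] = new Document(new Dictionary<string, Document>
                            {
                                ["sign"] = new Document(new Dictionary<string, Document>
                                {
                                    ["type"] = "string",
                                    ["description"] = "The radio station call sign."
                                })
                            }),
                            ["required"] = new Document(new List<Document> { "sign" })
                        })
                    }
                }
            }
        ]
    },
    InferenceConfig = new InferenceConfiguration { MaxTokens = 4096 }
};

var client = new AmazonBedrockRuntimeClient();
var response = await client.ConverseAsync(request);
// When the model decides to call a tool, response.Output.Message.Content contains ToolUse
// blocks, which then need to be dispatched to the matching delegate and returned as ToolResult content.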
How do we get some traction on this issue?
I stumbled across this myself and thought I was doing something wrong, since my plugins auto-invoke correctly against the Azure OpenAI connector but not on the Bedrock Claude connector. It would be nice if the library itself threw an exception, or at least logged a warning, that plugins can't be used on this connector if it is truly unsupported today; it would save developers time digging for a resolution.
Lost a good few hours to this. I can't get it working with any Bedrock model: Claude, Mistral, or Llama. Is there currently any Bedrock model I can use that can invoke plugin functions?
+1 for getting this fixed. ty!
Unable to use SK because of this issue. +1 to having this fixed. thanks!
I am excited to see this moved into the current sprint. Just a note that in my experimentation, converting from the Amazon Bedrock Microsoft.Extensions.AI-based IChatClient to IChatCompletionService seems to work, and I do see function calls occurring, but the function call detail doesn't seem to make it back into the SK ChatHistory object correctly. Additionally, the AutoInvoke filters are also not firing.
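For reference, this is roughly the bridge I am experimenting with (how the Bedrock-backed IChatClient is created is left out here, since that depends on the Microsoft.Extensions.AI support you have available for Bedrock):

using Microsoft.Extensions.AI;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Sketch of bridging a Microsoft.Extensions.AI IChatClient into SK's IChatCompletionService.
// Obtaining the Bedrock-backed IChatClient is elided here.
IChatClient bedrockChatClient = /* MEAI chat client for your Bedrock model */ null!;

IChatCompletionService chatService = bedrockChatClient.AsChatCompletionService();

var builder = Kernel.CreateBuilder();
builder.Services.AddSingleton(chatService);
var kernel = builder.Build();

With this setup I see the tool calls being made, but as noted above the details don't land in ChatHistory as expected and the auto-invocation filters never fire.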
Will #11628 / #10319 help solve this? Maybe that is a more straightforward pattern here, offering broader support across connectors in general by skipping the Bedrock connector built into SK and cutting over to MEAI when the supporting code is ready?
Thanks @RogerBarreto for all of your hard work on these PR's; the progress being made here is fantastic.
Will https://github.com/microsoft/semantic-kernel/issues/11628 / https://github.com/microsoft/semantic-kernel/issues/10319 help solve this? Maybe that is a more straightforward pattern here, offering broader support across connectors in general by skipping the Bedrock connector built into SK and cutting over to MEAI when the supporting code is ready?
Yes, this is already working in the feature branch. As soon as the branch is in a mergeable state we will bring the feature to main; with that you won't need to call IChatClient.AsChatCompletionService() anymore, as IChatClients will be natively identified and used in SK internals.
but the function call detail doesn't seem to make it back into the SK ChatHistory object correctly.
Not sure about this one; can you elaborate more? If that's the case, please create a separate issue so we can investigate it individually.
Opened a PR against the AWS SDK for .NET; if it gets merged, this problem can be sorted using the AddBedrockChatClient() extensions.
- https://github.com/aws/aws-sdk-net/pull/3845