
[Bug]: Error when passing Function call response with Anthropic on Bedrock - 'ChatCompletionMessageToolCall' object is not subscriptable

Open repko-artem opened this issue 1 year ago • 4 comments

What happened?

When calling tools with an Anthropic model (I'm using bedrock/anthropic.claude-3-sonnet), I always get the error: TypeError: 'ChatCompletionMessageToolCall' object is not subscriptable

The problem is in /prompt_templates/factory.py, in the method convert_to_anthropic_tool_invoke(tool_calls: list): the tool calls are ChatCompletionMessageToolCall objects, not dicts, so you can't access their fields with subscript syntax like tool["type"]. Attribute access (tool.type) is needed instead.

All that's needed is to replace such subscript accesses in this method.
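For illustration, here is a minimal sketch of the dict-vs-object mismatch and one defensive way around it. Note that tool_field is a hypothetical helper written for this example, not LiteLLM's actual code:

```python
# Sketch: the crash happens because tool calls in the message history can be
# ChatCompletionMessageToolCall objects rather than dicts, so subscript
# access like tool["type"] raises TypeError. A defensive accessor
# (hypothetical helper) that handles both shapes:
def tool_field(tool, key):
    """Return a field from a tool call, whether it's a dict or an object."""
    if isinstance(tool, dict):
        return tool[key]
    return getattr(tool, key)


# Example with both shapes:
class FakeToolCall:
    """Stand-in for a pydantic-style tool call object."""
    type = "function"


assert tool_field({"type": "function"}, "type") == "function"
assert tool_field(FakeToolCall(), "type") == "function"
```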

Relevant log output

File "C:\Projects\!other\lightLLMTest\app.py", line 139, in main
    completion_result = completion(
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\utils.py", line 2775, in wrapper
    raise e
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\utils.py", line 2672, in wrapper
    result = original_function(*args, **kwargs)
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\main.py", line 2064, in completion
    raise exception_type(
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\utils.py", line 8262, in exception_type
    raise e
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\utils.py", line 7374, in exception_type
    raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: BedrockException - Traceback (most recent call last):
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\llms\bedrock.py", line 745, in completion
    messages = prompt_factory(
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\llms\prompt_templates\factory.py", line 1044, in prompt_factory
    return anthropic_messages_pt(messages=messages)
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\llms\prompt_templates\factory.py", line 693, in anthropic_messages_pt
    assistant_text += convert_to_anthropic_tool_invoke(
  File "C:\Projects\!other\lightLLMTest\.venv\lib\site-packages\litellm\llms\prompt_templates\factory.py", line 612, in convert_to_anthropic_tool_invoke
    if tool["type"] != "function":
TypeError: 'ChatCompletionMessageToolCall' object is not subscriptable



repko-artem avatar Apr 04 '24 13:04 repko-artem

Missed this. Thanks for the issue @repko-artem, I'll work on repro'ing this + having a fix out ASAP.

We already have tests on the Bedrock function-calling integration, so I'm surprised this didn't get caught.

krrishdholakia avatar Apr 06 '24 20:04 krrishdholakia

Unable to repro this bug @repko-artem

[Screenshot: successful tool call, Apr 06 '24]

Here's my call:

import litellm
from litellm import ModelResponse, completion

litellm.set_verbose = True
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                    },
                },
                "required": ["location"],
            },
        },
    }
]
messages = [
    {"role": "user", "content": "What's the weather like in Boston today?"}
]
response: ModelResponse = completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
print(f"response: {response}")
# Add any assertions here to check the response
assert isinstance(response.choices[0].message.tool_calls[0].function.name, str)
assert isinstance(
    response.choices[0].message.tool_calls[0].function.arguments, str
)
        

krrishdholakia avatar Apr 06 '24 20:04 krrishdholakia

Oh wait - I see the issue: it happens when we pass the result of a function call back into completion.

Got it. We were missing this test for Bedrock. Is this what you're doing as well? @repko-artem

import litellm
from litellm import completion

litellm.set_verbose = True
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]
messages = [{"role": "user", "content": "What's the weather like in Boston today?"}]
# test without max tokens
response = completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
# Add any assertions here to check the response args
print(response)
assert isinstance(response.choices[0].message.tool_calls[0].function.name, str)
assert isinstance(
    response.choices[0].message.tool_calls[0].function.arguments, str
)

messages.append(
    response.choices[0].message.model_dump()
)  # Add assistant tool invokes
tool_result = (
    '{"location": "Boston", "temperature": "72", "unit": "fahrenheit"}'
)
# Add user submitted tool results in the OpenAI format
messages.append(
    {
        "tool_call_id": response.choices[0].message.tool_calls[0].id,
        "role": "tool",
        "name": response.choices[0].message.tool_calls[0].function.name,
        "content": tool_result,
    }
)
# In the second response, Claude should deduce answer from tool results
second_response = completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
print(second_response)

krrishdholakia avatar Apr 06 '24 20:04 krrishdholakia

@krrishdholakia Thanks a lot! That's exactly my case.

repko-artem avatar Apr 08 '24 15:04 repko-artem