Support generating and returning tool_calls with Anthropic models
The proposed changes include:
1/ Ability to use tools with .generate()
2/ Returning stop_reason and tool_calls on the AIMessage (in the response metadata) when tools are used
Support for function calling via the generate function is not currently implemented in ChatBedrock.
from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage
from pydantic import BaseModel, Field

chat = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs={"temperature": 0.1},
)

class GetWeather(BaseModel):
    """Get the current weather in a given location"""

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")

llm_with_tools = chat.bind_tools([GetWeather])
llm_with_tools

messages = [
    HumanMessage(content="what is the weather like in San Francisco")
]

ai_msg = llm_with_tools.generate(messages)
ai_msg
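With this change, a tool invocation by the model should surface on the returned message rather than being dropped. As a minimal, pure-Python sketch (a hypothetical helper, not the PR's actual code), this is how Anthropic `tool_use` content blocks map onto the `tool_calls` list the PR exposes:

```python
# Hypothetical helper illustrating how Anthropic "tool_use" content blocks
# in a raw response map onto the tool_calls list on an AIMessage.
def extract_tool_calls(content_blocks):
    """Collect Anthropic 'tool_use' blocks as tool-call dicts."""
    calls = []
    for block in content_blocks:
        if block.get("type") == "tool_use":
            calls.append({
                "name": block["name"],   # which tool the model chose
                "args": block["input"],  # the arguments it filled in
                "id": block["id"],       # id to pair with the tool result
            })
    return calls

# Example raw content shaped like a Claude response that calls a tool:
raw = [
    {"type": "text", "text": "I'll look that up."},
    {"type": "tool_use", "id": "toolu_01", "name": "GetWeather",
     "input": {"location": "San Francisco, CA"}},
]
print(extract_tool_calls(raw))
# → [{'name': 'GetWeather', 'args': {'location': 'San Francisco, CA'}, 'id': 'toolu_01'}]
```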
Hi, is there anything we can do to help get this PR merged?
Hi, is the tool_choice argument of the bind_tools method functioning? I would like to force the LLM to use a tool, but it does not seem to work. Are there any workarounds? Thank you in advance.
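Not an answer for this branch specifically, but in the Anthropic Messages API a forced tool call is expressed with a `tool_choice` payload, so one possible workaround (untested here; whether `bind_tools` forwards it to the request body is an assumption) is to pass that payload explicitly:

```python
# Hypothetical helper that builds the Anthropic Messages API tool_choice
# payload: "any" forces the model to call *some* tool, while
# {"type": "tool", "name": ...} forces a specific named tool.
def force_tool(name=None):
    if name is None:
        return {"type": "any"}
    return {"type": "tool", "name": name}

# If bind_tools forwards tool_choice to the Anthropic request body
# (assumption, not verified on this branch), forcing might look like:
# llm_with_tools = chat.bind_tools([GetWeather],
#                                  tool_choice=force_tool("GetWeather"))
print(force_tool("GetWeather"))
# → {'type': 'tool', 'name': 'GetWeather'}
```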
For those coming across this PR in search of a solution: I had an issue where the tools were not actually being called. I have created a PR against the fork created by @bigbernnn, which can be found here; it resolved my related issue here.
Can this please be reviewed and merged? I can't switch my existing codebase to Bedrock due to this error.
# llm = ChatOpenAI(model="gpt-4o", openai_api_key=OPENAI_API_KEY)  # works fine
# llm = ChatAnthropic(model="claude-3-opus-20240229")              # works fine
llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs=dict(temperature=0),
)  # raises:
# ValueError: System message must be a string, instead was: <class 'list'>