Tool calling problem with non-Llama models using `langgraph_react_agent` template
Question: Non-Llama models not calling tools when using langgraph_react_agent template
When using the `langgraph_react_agent` template, the models do not call the tools as expected.
If I use regular LangGraph code directly in Python, everything works fine and the tools are invoked with the same prompt (except for GPT-OSS).
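For reference, this is roughly what I mean by running LangGraph directly (a minimal sketch; the endpoint URL, API key, project ID, model ID, and the import path of the tool list are placeholders):

from ibm_watsonx_ai import APIClient, Credentials
from langchain_ibm import ChatWatsonx
from langgraph.prebuilt import create_react_agent

from langgraph_react_agent_base import TOOLS  # same tool list as in the samples below

# Placeholder credentials - substitute your own endpoint, API key and project ID
client = APIClient(
    credentials=Credentials(url="https://us-south.ml.cloud.ibm.com", api_key="***"),
    project_id="***",
)

chat = ChatWatsonx(model_id="mistralai/mistral-large", watsonx_client=client)
graph = create_react_agent(chat, tools=TOOLS, state_modifier="use your tool to do math")

# Plain (non-streaming) invocation - the multiply tool is called as expected
result = graph.invoke({"messages": [("user", "what is 43.2 * 12.563?")]})
print(result["messages"][-1].content)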
- Code samples
tools.py
from langchain_core.tools import tool


@tool("multiply", parse_docstring=True)
def multiply(a: float, b: float) -> float:
    """Multiplies a times b and returns the result as float.

    Args:
        a: first number to multiply
        b: second number to multiply

    Returns:
        product result
    """
    print("\nMultiply tool called with parameters: {}, {}".format(a, b))
    return a * b
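As a sanity check, the tool itself works when invoked directly, so the problem is on the model / tool-calling side (illustrative only):

# Bypasses the model entirely and calls the underlying function
print(multiply.invoke({"a": 43.2, "b": 12.563}))  # prints 542.7216000000001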
__init__.py
from .tools import *
TOOLS = [multiply]
agent.py
from typing import Callable

from ibm_watsonx_ai import APIClient
from langchain_ibm import ChatWatsonx
from langgraph.prebuilt import create_react_agent

from langgraph_react_agent_base import TOOLS


def get_graph_closure(client: APIClient, model_id: str) -> Callable:
    """Graph generator closure."""
    # Initialise ChatWatsonx
    chat = ChatWatsonx(
        model_id=model_id,
        watsonx_client=client,
        params={
            "temperature": 0.01,
            "max_new_tokens": 512,
            "repetition_penalty": 1.1,
        },
    )

    def get_graph():
        """Get compiled graph with overwritten system prompt, if provided."""
        # Create an instance of the compiled graph
        return create_react_agent(
            chat, tools=TOOLS, state_modifier="use your tool to do math"
        )

    return get_graph
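For completeness, the closure is exercised locally roughly like this (a sketch; client setup follows the same placeholder pattern as above, and the message printing is illustrative):

graph = get_graph_closure(client, "meta-llama/llama-3-3-70b-instruct")()

response = graph.invoke({"messages": [("user", "what is 43.2 * 12.563?")]})
for message in response["messages"]:
    message.pretty_print()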
- Current situation
Below is the observed output for each model ID:
- model_id = "meta-llama/llama-3-405b-instruct" ✅
watsonx-ai template invoke "what is 43.2 * 12.563?"
============================== Assistant Message ===============================
[{'id': 'chatcmpl-tool-0987638136494ecea0d8af3d0eb1d933', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a": "43", "b": "12"}'}}]
Multiply tool called with parameters: 43.0, 12.0
================================= Tool Message =================================
516.0
============================== Assistant Message ===============================
The result of 43.2 * 12.563 is 516.0.
- model_id = "meta-llama/llama-3-2-90b-vision-instruct" ✅
watsonx-ai template invoke "what is 43.2 * 12.563?"
============================== Assistant Message ===============================
[{'id': 'chatcmpl-tool-3074141977964ff39067c849c5e09779', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a": "43", "b": "12"}'}}]
Multiply tool called with parameters: 43.0, 12.0
================================= Tool Message =================================
516.0
============================== Assistant Message ===============================
The result of 43.2 * 12.563 is 516.0.
- model_id = "openai/gpt-oss-120b" ❌
watsonx-ai template invoke "what is 43.2 * 12.563?"
============================== Assistant Message ===============================
\(43.2 \times 12.563 = 542.7216\)
GPT-OSS tends not to use tools, and when "forced" it throws an error, even without using this template (see the forcing sketch after the per-model results below).
- model_id = "meta-llama/llama-4-maverick-17b-128e-instruct-fp8" ✅
============================== Assistant Message ===============================
[{'id': 'chatcmpl-tool-443e39248db14b428260bd2f275b2315', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a": 43.2, "b": 12.563}'}}]
Multiply tool called with parameters: 43.2, 12.563
================================= Tool Message =================================
542.7216000000001
============================== Assistant Message ===============================
The result of multiplying 43.2 by 12.563 is 542.7216.
- model_id = "meta-llama/llama-3-3-70b-instruct" ✅
watsonx-ai template invoke "what is 43.2 * 12.563?"
============================== Assistant Message ===============================
[{'id': 'chatcmpl-tool-48b7acadc5b340e8a25d010ef7791cf5', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a": 43.2, "b": 12.563}'}}]
Multiply tool called with parameters: 43.2, 12.563
================================= Tool Message =================================
542.7216000000001
============================== Assistant Message ===============================
The result of the multiplication is 542.7216.
- model_id = "mistralai/mistral-large" ❌
watsonx-ai template invoke "what is 43.2 * 12.563?"
LifecycleWarning: Model 'mistralai/mistral-large' is in deprecated state from 2025-07-09 until 2025-10-08. IDs of alternative models: mistralai/mistral-medium-2505. Further details: https://dataplatform.cloud.ibm.com/docs/content/wsj/analyze-data/fm-model-lifecycle.html?context=wx&audience=wdp
warn(model_state_warning, category=LifecycleWarning)
============================== Assistant Message ===============================
[{'id': 'BW94OA23o', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a":, "b": 12.563{"a": 43.2, "b": 12.563}'}}]
================================= Tool Message =================================
Error: 2 validation errors for multiply
a
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
b
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
Please fix your mistakes.
============================== Assistant Message ===============================
[{'id': 'vB1KVKURr', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a":, "b": 12.563{"a": 43.2, "b": 12.563}'}}]
================================= Tool Message =================================
Error: 2 validation errors for multiply
a
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
b
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
Please fix your mistakes.
^C (interrupted manually; the agent keeps looping on the same malformed tool call)
- model_id = "mistralai/mistral-medium-2505" ❌
watsonx-ai template invoke "what is 43.2 * 12.563?"
- model_id = "mistralai/mistral-small-3-1-24b-instruct-2503" ❌
watsonx-ai template invoke "what is 43.2 * 12.563?"
============================== Assistant Message ===============================
[{'id': 'x59oa0NDB', 'type': 'function', 'function': {'name': 'multiply', 'arguments': '{"a":, "b": 12.563{"a": 43.2, "b": 12.563}'}}]
================================= Tool Message =================================
Error: 2 validation errors for multiply
a
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
b
Field required [type=missing, input_value={}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.11/v/missing
Please fix your mistakes.
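Regarding the GPT-OSS note above: by "forcing" a tool call I mean binding the tools with an explicit tool_choice, roughly as below (a sketch using the chat model and TOOLS from the code samples, and assuming ChatWatsonx.bind_tools accepts tool_choice the way other LangChain chat models do):

forced = chat.bind_tools(TOOLS, tool_choice="multiply")

# The Llama models return a well-formed tool call for this prompt;
# gpt-oss-120b raises an error instead (with or without the template).
message = forced.invoke("what is 43.2 * 12.563?")
print(message.tool_calls)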
@JulioSanchezD thank you for reporting this issue. We are investigating it.
Hi @JulioSanchezD, there is a problem with streaming in these models. However, you can set the `stream` flag in `config.toml` to `false`, and the agent should call tools correctly.
[cli.options]
# If true, the CLI `invoke` command uses `ai_service.generate_stream` for local tests; otherwise it uses `ai_service.generate`.
# Default: true
stream = false
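If you want to verify the workaround outside the template, the difference between the two paths can be sketched directly against the chat model from your code samples (illustrative only; the streaming path is presumably where the malformed tool-call arguments come from):

chat_with_tools = chat.bind_tools(TOOLS)

# Non-streaming - analogous to `ai_service.generate` (stream = false):
# the tool-call arguments arrive as one well-formed JSON string
print(chat_with_tools.invoke("what is 43.2 * 12.563?").tool_calls)

# Streaming - analogous to `ai_service.generate_stream` (stream = true):
# tool-call arguments arrive in chunks and are aggregated client-side
aggregate = None
for chunk in chat_with_tools.stream("what is 43.2 * 12.563?"):
    aggregate = chunk if aggregate is None else aggregate + chunk
print(aggregate.tool_calls)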