Tool calling not working with Ollama?
Tool usage doesn't seem to work with local models hosted on Ollama... it just prints the tool calls as part of the conversation, like this (using the MFDoom/deepseek-r1-tool-calling:7b model):
### What time is it?
{"name": "current_time", "parameters": {"include_date": true, "include_time": true}}
I do have the current_time tool defined, and that would be a valid call... but it doesn't actually call it, it just prints it in the chat.
I'm not sure whether this is a problem with the Ollama backend or whether I'm doing something incorrectly. Please advise, thank you.
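For reference, the tool is defined along these lines (a sketch: the argument schema matches what's in the log below, but the :function body and :category here are just placeholders, and the exact gptel-make-tool syntax may differ slightly from my actual config):

(gptel-make-tool
 :name "current_time"
 :description "get the current time"
 :args (list '(:name "include_date"
               :type boolean
               :description "if true, include the date in the result")
             '(:name "include_time"
               :type boolean
               :description "if true, include the time in the result"))
 :function (lambda (include_date include_time)
             ;; placeholder body: include "%F" / "%T" when the corresponding
             ;; flag is non-nil
             (format-time-string
              (mapconcat #'identity
                         (delq nil (list (and include_date "%F")
                                         (and include_time "%T")))
                         " ")))
 :category "time")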
Please generate a log of the interaction and paste it here. The bug report template on the create new issue page explains how to access the log. (There is no need to create a new issue.)
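Roughly, it amounts to:

;; enable logging before sending the request
(setq gptel-log-level 'info)   ;; or 'debug for additional detail
;; then reproduce the problem and paste the contents of the *gptel-log* buffer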
Here's the log of a simple test case with a current_time tool. The tool itself seems alright: I've asked cloud LLMs (GPT/Gemini/Codestral) the same question and they have no problem using the tool and telling me what time it is. So far, though, I haven't had any luck getting local models to actually use the tool; they just spit the JSON for the tool call into the chat instead.
Here's a log:
{
"gptel": "request headers",
"timestamp": "2025-08-14 04:37:13"
}
{
"Content-Type": "application/json"
}
{
"gptel": "request body",
"timestamp": "2025-08-14 04:37:13"
}
{
"model": "MFDoom/deepseek-r1-tool-calling:7b",
"messages": [
{
"role": "system",
"content": "You are a large language model living in Emacs and a helpful assistant. Provide detailed responses."
},
{
"role": "user",
"content": "What time is it?"
}
],
"stream": false,
"tools": [
{
"type": "function",
"function": {
"name": "current_time",
"description": "get the current time",
"parameters": {
"type": "object",
"properties": {
"include_date": {
"type": "boolean",
"description": "if true, include the date in the result"
},
"include_time": {
"type": "boolean",
"description": "if true, include the time in the result"
}
},
"required": [
"include_date",
"include_time"
],
"additionalProperties": false
}
}
}
],
"options": {
"temperature": 1.0
}
}
{"gptel": "request Curl command", "timestamp": "2025-08-14 04:37:13"}
curl \
--disable \
--location \
--silent \
--compressed \
-XPOST \
-y7200 \
-Y1 \
-D- \
-w\(55804a5b14cd5070219c13f55f45bec7\ .\ \%\{size_header\}\) \
-d\{\"model\"\:\"MFDoom/deepseek-r1-tool-calling\:7b\"\,\"messages\"\:\[\{\"role\"\:\"system\"\,\"content\"\:\"You\ are\ a\ large\ language\ model\ living\ in\ Emacs\ and\ a\ helpful\ assistant.\ Provide\ detailed\ responses.\"\}\,\{\"role\"\:\"user\"\,\"content\"\:\"What\ time\ is\ it\?\"\}\]\,\"stream\"\:false\,\"tools\"\:\[\{\"type\"\:\"function\"\,\"function\"\:\{\"name\"\:\"current_time\"\,\"description\"\:\"get\ the\ current\ time\"\,\"parameters\"\:\{\"type\"\:\"object\"\,\"properties\"\:\{\"include_date\"\:\{\"type\"\:\"boolean\"\,\"description\"\:\"if\ true\,\ include\ the\ date\ in\ the\ result\"\}\,\"include_time\"\:\{\"type\"\:\"boolean\"\,\"description\"\:\"if\ true\,\ include\ the\ time\ in\ the\ result\"\}\}\,\"required\"\:\[\"include_date\"\,\"include_time\"\]\,\"additionalProperties\"\:false\}\}\}\]\,\"options\"\:\{\"temperature\"\:1.0\}\} \
-HContent-Type\:\ application/json \
http\://localhost\:11434/api/chat
{
"gptel": "response headers",
"timestamp": "2025-08-14 04:38:16"
}
"HTTP/1.1 200 OK\r\nContent-Type: application/json; charset=utf-8\r\nDate: Thu, 14 Aug 2025 08:38:16 GMT\r\nContent-Length: 1947\r\n\r"
{
"gptel": "response body",
"timestamp": "2025-08-14 04:38:16"
}
{
"model": "MFDoom/deepseek-r1-tool-calling:7b",
"created_at": "2025-08-14T08:38:16.6457Z",
"message": {
"role": "assistant",
"content": "<think>\nOkay, so I need to figure out what time it is right now using the available tools. The user provided a tool called \"current_time\" which seems perfect for this task. Let me think about how that works.\n\nFirst, I should check if the function requires any parameters. Looking at the description, it says the current_time function needs two boolean parameters: include_date and include_time. Both are optional but default to true? Or maybe they have defaults based on whether something is required?\n\nWait, no, in the parameters section, both include_date and include_time are described as having the property type boolean with descriptions. So I guess each of them has a default value if not specified.\n\nBut since the user didn't specify any additional arguments, perhaps it's safer to leave those as true unless there's a reason not to. In this case, maybe we should just call current_time without any parameters so that include_date and include_time are both true by default. That way, it will return both date and time.\n\nAlternatively, if I wanted only the time part, perhaps I'd set include_date to false. But since the user just asked for \"what time is it?\", maybe including the date isn't necessary unless specified. However, given that the current_time function can handle it, and using default parameters is safe, I'll proceed without adding any arguments.\n\nSo, the call would be {\"name\": \"current_time\", \"parameters\": {}} because there are no additional required parameters beyond what's included by default.\n</think>\n\n{\"name\": \"current_time\", \"parameters\": {}}"
},
"done_reason": "stop",
"done": true,
"total_duration": 62762803000,
"load_duration": 31049123708,
"prompt_eval_count": 218,
"prompt_eval_duration": 2115273458,
"eval_count": 322,
"eval_duration": 29574813292
}
It looks like this model is generating tool calls as part of the response text, so there's nothing gptel can do to distinguish the tool call from ordinary output.
I suspect the problem is the model here. Have you tried this with any other model?
I'm closing this as this is not a gptel bug -- the model is not acting in accordance with Ollama's tool calling API.
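If anyone wants to double-check a model outside gptel, here is a minimal sketch (assuming Ollama is listening on localhost:11434) that sends the same request with url.el and inspects message.tool_calls; a model that follows Ollama's tool-calling API returns the call there, not inline in message.content:

(require 'json)
(require 'url)
(require 'url-http)  ;; for url-http-end-of-headers

(let ((url-request-method "POST")
      (url-request-extra-headers '(("Content-Type" . "application/json")))
      (url-request-data
       (json-encode
        '((model . "MFDoom/deepseek-r1-tool-calling:7b")
          (messages . [((role . "user") (content . "What time is it?"))])
          (stream . :json-false)
          (tools . [((type . "function")
                     (function
                      . ((name . "current_time")
                         (description . "get the current time")
                         (parameters
                          . ((type . "object")
                             (properties
                              . ((include_date . ((type . "boolean")))
                                 (include_time . ((type . "boolean")))))
                             (required . ["include_date" "include_time"]))))))])))))
  (with-current-buffer (url-retrieve-synchronously "http://localhost:11434/api/chat")
    (let* ((body (decode-coding-string
                  (buffer-substring-no-properties url-http-end-of-headers (point-max))
                  'utf-8))
           (msg (alist-get 'message (json-parse-string body :object-type 'alist))))
      ;; A compliant model populates tool_calls; this model leaves it empty
      ;; and prints the call inside content instead.
      (message "tool_calls: %S" (alist-get 'tool_calls msg))
      (message "content: %s" (alist-get 'content msg)))))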