Add support for function call (response back) (message.role=tool)
- Currently there's no support for sending the function call result back to the model using role=tool messages.
- Using the native API (not openai), function tool calls don't have an associated identifier (tool_call_id). This is present in the openai API, and it is important for it to be available on both APIs.
[!IMPORTANT] This ID is very important when providing the result back to the model (in a chat history where the same function was invoked multiple times with different results), so the model can reason about which result belongs to which call.
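To illustrate why the ID matters, here is a sketch of an OpenAI-style chat history where the same function is called twice. The ids, function name, and weather values are made up for illustration; only tool_call_id matching is the point.

```python
import json

# Hypothetical history: one assistant turn issues two calls to the same
# function; the tool_call_id on each role=tool result is what lets the
# model match each result to its originating call.
history = [
    {"role": "user", "content": "What's the weather in Paris and in Rome?"},
    {
        "role": "assistant",
        "tool_calls": [
            {"id": "call_1", "type": "function",
             "function": {"name": "get_weather",
                          "arguments": json.dumps({"city": "Paris"})}},
            {"id": "call_2", "type": "function",
             "function": {"name": "get_weather",
                          "arguments": json.dumps({"city": "Rome"})}},
        ],
    },
    # Without tool_call_id, these two results would be ambiguous.
    {"role": "tool", "tool_call_id": "call_1",
     "content": json.dumps({"temp_c": 12})},
    {"role": "tool", "tool_call_id": "call_2",
     "content": json.dumps({"temp_c": 19})},
]

print(json.dumps(history, indent=2))
```

Dropping the tool_call_id field from the two result messages leaves the model with two answers and no way to tell which city each belongs to, which is the gap described above.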

Function call results can use role=tool if the template supports it, e.g. {{- else if eq .Role "tool" }} in the llama3.2 template.
The tool id would be nice to have; in practice I haven't found it to be an issue, probably because in my use cases the completion contains only one or two tool calls.
Hi, any update on this?
I have probably the same issue, using the default models (templates) provided by ollama.
Using the native API, role=tool response messages are present in the history, but several models I tried (llama3.1, qwen2.5) respond that they don't see the tool response when asked, unless it is also provided as a role=user comment, which should not be the correct way.
(The tested models are able to make valid tool calls.)
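A minimal sketch of the kind of native /api/chat request body described above, for reproduction purposes. The model name, function name, and content values are examples only; note that there is no identifier field to attach to the role=tool message.

```python
import json

# Assumed native chat payload: the tool result goes back as a bare
# role=tool message, with no tool_call_id to tie it to the call.
payload = {
    "model": "qwen2.5",
    "stream": False,
    "messages": [
        {"role": "user", "content": "What time is it?"},
        {"role": "assistant", "tool_calls": [
            {"function": {"name": "get_time", "arguments": {}}}
        ]},
        # Native API: no identifier available to attach here.
        {"role": "tool", "content": "14:05"},
    ],
}

# Posting this to http://localhost:11434/api/chat (e.g. with
# requests.post(..., json=payload)) is where the reported behavior
# appears: some model templates seem to ignore the role=tool message.
print(json.dumps(payload, indent=2))
```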
Thanks in advance.