
Add support for function call (response back) (message.role=tool)

Open rogerbarreto opened this issue 1 year ago • 2 comments

Add support for function call (response back)

  1. Currently there's no support for sending the function call result back to the model using role=tool messages.
  2. Using the native API (not the OpenAI-compatible one), function tool calls don't have an associated identifier (tool_call_id). This identifier is present in the OpenAI API, and it is important for it to be available on both APIs.

[!IMPORTANT] This ID is very important when providing the result back to the model: in a chat history where the same function was invoked multiple times with different results, the model needs it to reason about which result belongs to which call.
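A minimal sketch of the scenario the callout describes, using the OpenAI-style message shape. The function name, arguments, ids, and results here are all made up for illustration:

```python
# Hypothetical history: the same function ("get_weather") is called twice in
# one turn. Without a tool_call_id on each result, the model cannot tell
# which answer belongs to which call.
history = [
    {"role": "user", "content": "Weather in Paris and in Rome?"},
    {
        "role": "assistant",
        "tool_calls": [
            {"id": "call_1", "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}},
            {"id": "call_2", "function": {"name": "get_weather", "arguments": '{"city": "Rome"}'}},
        ],
    },
    # Each role=tool result is correlated with its call via tool_call_id.
    {"role": "tool", "tool_call_id": "call_1", "content": "18C, cloudy"},
    {"role": "tool", "tool_call_id": "call_2", "content": "24C, sunny"},
]

# Reconstruct the call -> result mapping the id makes possible.
calls = {c["id"]: c["function"]["name"] for c in history[1]["tool_calls"]}
results = {m["tool_call_id"]: m["content"] for m in history if m["role"] == "tool"}
```

With identical function names and no id, the two results above would be indistinguishable in the history.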

[image attachment (broken link)]

rogerbarreto avatar Nov 05 '24 13:11 rogerbarreto

Broken image link.

Function call results can use role=tool if the template supports it, e.g. `{{- else if eq .Role "tool" }}` in the llama3.2 template.
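A sketch of what a native /api/chat request body carrying a function result back might look like, assuming a template that handles the tool role as described above. The model name matches the llama3.2 example; the "get_weather" function and its values are illustrative:

```python
import json

# Hypothetical native-API request body. Note the final role=tool message has
# no tool_call_id field, which is the gap this issue reports for the native API.
request_body = {
    "model": "llama3.2",
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"},
        {
            "role": "assistant",
            "tool_calls": [
                {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}}
            ],
        },
        # Function result sent back; rendering depends on the model's template.
        {"role": "tool", "content": "18C, cloudy"},
    ],
    "stream": False,
}
payload = json.dumps(request_body)
```

Whether the model actually sees the result hinges on the template branch for `.Role "tool"`; templates without it silently drop the message.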

The tool id would be nice to have; in practice I haven't found its absence to be an issue, probably because in my use cases the completion contains only one or two tool calls.

rick-github avatar Nov 05 '24 14:11 rick-github

Hi, any update on this?

I have what is probably the same issue, using the default templates provided with models by ollama.

Using the native API, role=tool response messages are present in the history, but the models I tried (llama3.1, qwen2.5) respond that they don't see a tool response when asked, unless the result is also provided as a role=user message, which should not be the correct way.

(The tested models are able to make valid tool calls.)
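The two shapes contrasted in the report above can be sketched as a small helper; the message content is illustrative:

```python
def result_message(result: str, use_workaround: bool) -> dict:
    """Build the chat message that carries a function result back.

    use_workaround=True reproduces the role=user workaround described above
    (which models do see, but which should not be necessary); False uses the
    role=tool message the native API is supposed to accept.
    """
    if use_workaround:
        return {"role": "user", "content": f"Tool result: {result}"}
    return {"role": "tool", "content": result}
```

If the model only acknowledges the result when `use_workaround=True`, the role=tool message is most likely being dropped by the model's template rather than by the API itself.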

Thanks in advance.

toomeenoo avatar Dec 06 '24 17:12 toomeenoo