Support tools in OpenAI-compatible API
Support the `tools` and `tool_choice` parameters in the OpenAI-compatible API. Currently these are not supported: https://github.com/ollama/ollama/blob/4ec7445a6f678b6efc773bb9fa886d7c9b075577/docs/openai.md#supported-request-fields
I believe llama.cpp is used internally by Ollama, and it has support for tools and tool_choice (https://github.com/abetlen/llama-cpp-python#function-calling), so the implementation in Ollama can hopefully leverage this.
Having the `tools` parameter implemented in Ollama (or in llama.cpp, with Ollama surfacing it) would standardize all downstream packages (e.g. https://github.com/BerriAI/litellm, https://github.com/jackmpcollins/magentic) on a single prompt and implementation, making tool calling more robust.
Related issues
- https://github.com/ollama/ollama/issues/305
- https://github.com/ollama/ollama/issues/1729
- https://github.com/ollama/ollama/issues/3165
OpenAI compatibility, for example with curl:

```shell
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4-turbo",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What’s in this image?"
          },
          {
            "type": "image_url",
            "image_url": {
              "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"
            }
          }
        ]
      }
    ],
    "max_tokens": 300
  }'
```
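For reference, a request using the `tools` and `tool_choice` parameters would look roughly like this. This is a sketch of the OpenAI request shape only; the `get_current_weather` function is a hypothetical example, not part of any real API:

```python
import json

# Sketch of an OpenAI-style chat completion request body using the
# "tools" and "tool_choice" parameters. The get_current_weather tool
# is a hypothetical example used for illustration.
request_body = {
    "model": "gpt-4-turbo",
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {
                            "type": "string",
                            "description": "City name",
                        }
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    # "auto" lets the model decide whether to call a tool;
    # "none" disables tool calls entirely.
    "tool_choice": "auto",
}

print(json.dumps(request_body, indent=2))
```

This is the request shape that downstream packages like litellm and magentic would emit, which is why supporting it directly in Ollama avoids each of them maintaining its own function-calling prompt.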
+1
+1 Looks like I'm too early to the party yet again
I'd like to contribute to this, but I'm not familiar with the repo. If someone can lay out what needs to be done, I'd be happy to help.
Hello @jmorganca, any timeline for this one?
@jmorganca even a rough estimate would be highly appreciated, thank you!
@langchain4j uhh... I think maybe it was just merged, actually? I can't really tell, but the PR name and description line up:
https://github.com/ollama/ollama/pull/5614
@jmorganca, is this going to be part of the next release?
I think this is just part of what needs to be done to have tools working in Ollama?
@humcqc yes, agreed. I've seen lots of commits about tools go through now.
Here's an X post I saw about tools with a demo if anyone is curious; it was mentioned in their keynote.
https://x.com/AlexReibman/status/1814142347367817443
We are close!!!
Might I also add - llama3.1's template already has tools baked in (as I'm sure others do as well)
https://ollama.com/library/llama3.1/blobs/11ce4ee3e170
I'll believe it when the dang "Function calling" checkbox is checked. I check every day and it's getting unhealthy: https://github.com/ollama/ollama/blob/main/docs/openai.md#endpoints
Yes, seems we are close!!
Hi there, this is now supported as of 0.3.0 https://ollama.com/blog/tool-support
Note: OpenAI streaming tool calling isn't yet implemented, but this is something being worked on
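When the model decides to call a tool, the OpenAI-style response carries the call in the message's `tool_calls` field with JSON-encoded arguments. A minimal sketch of dispatching such a call; the sample message below is hand-written in the OpenAI response shape for illustration (not real model output), and `get_current_weather` is a hypothetical tool:

```python
import json

# Hand-written sample of an assistant message containing a tool call,
# following the OpenAI response shape (not actual model output).
sample_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_0",
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "arguments": "{\"city\": \"Paris\"}",
            },
        }
    ],
}

# Hypothetical local implementation of the advertised tool.
def get_current_weather(city: str) -> str:
    return f"22C and sunny in {city}"  # stub for illustration

DISPATCH = {"get_current_weather": get_current_weather}

# Look up each requested tool, decode its JSON arguments, and run it.
results = []
for call in sample_message.get("tool_calls") or []:
    fn = DISPATCH[call["function"]["name"]]
    args = json.loads(call["function"]["arguments"])
    results.append(fn(**args))

print(results)  # → ['22C and sunny in Paris']
```

In a real loop you would append each result back to the conversation as a `"role": "tool"` message and call the model again so it can produce the final answer.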
Awesome! I tested this for a RAG app and found that I had to use few-shot examples to get good results, at least from llama3.1. Blog post with more details: https://blog.pamelafox.org/2024/08/making-ollama-compatible-rag-app.html