Tools support in ChatCompletion endpoints
We'd love tools support so we can use ollama with our existing apps that already use the OpenAI API. Not sure if that's possible across the board with all models.
+1
Does that mean we should implement something similar to OpenAI's Assistants API? Like code interpreter, retrieval, etc., right?
My request is only that the chat completion endpoint support the tools parameter (and related tool_choice): https://platform.openai.com/docs/api-reference/chat#chat/create-functions
I do see that assistants also support tools, but that would be a much bigger feature.
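To make the ask concrete, here is a minimal sketch of the request we'd like the OpenAI-compatible endpoint to accept, using the standard openai Python client pointed at Ollama's local /v1 endpoint; the get_current_weather tool and the model name are just placeholders:

from openai import OpenAI

# Standard OpenAI client pointed at Ollama's OpenAI-compatible endpoint;
# the api_key is required by the client but not checked by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Hypothetical tool, described with the OpenAI JSON-schema tools format.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# The call we'd like the endpoint to honor; today the tools and
# tool_choice parameters are the missing piece.
response = client.chat.completions.create(
    model="mistral",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message)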
+1, updated link: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools
+1, Mistral models support function calling:
https://docs.mistral.ai/guides/function-calling/
Is the tools argument defined in the same way in Mistral and OpenAI?
# Mistral (mistralai client)
from mistralai.client import MistralClient
client = MistralClient(api_key=api_key)
response = client.chat(model=model, messages=messages, tools=tools, tool_choice="auto")

# OpenAI
from openai import OpenAI
client = OpenAI(api_key=api_key)
response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
    tool_choice=tool_choice,
)
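As far as I can tell the two line up: Mistral mirrors OpenAI's tools format (a list of JSON-schema function specs) and tool_choice values. Here's a hedged sketch of the full tool-call round trip with the openai client (placeholder tool and model names); this is the flow we'd want Ollama's /v1 endpoint to support:

import json
from openai import OpenAI

client = OpenAI()  # for Ollama this would be OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

tools = [{"type": "function", "function": {
    "name": "get_current_weather",  # hypothetical tool
    "description": "Get the current weather for a city",
    "parameters": {"type": "object",
                   "properties": {"city": {"type": "string"}},
                   "required": ["city"]},
}}]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

msg = response.choices[0].message
if msg.tool_calls:
    # Append the assistant message carrying the tool calls, then one
    # "tool" message per call with the tool's output.
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = json.dumps({"city": args["city"], "temperature_c": 21})  # fake tool output
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": result,
        })
    # Second request lets the model turn the tool output into a final answer.
    followup = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
        tools=tools,
    )
    print(followup.choices[0].message.content)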
New function-calling model:
https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B
I suggest looking at the llama-cpp-python implementation for functionary (a chat handler).
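For anyone curious, this is roughly how llama-cpp-python exposes it: a chat format / chat handler renders the tool definitions into the prompt and parses the tool calls back out of the completion. A rough sketch based on my reading of its docs; the model filename, the repo id, and the exact chat_format string are assumptions, so check the current docs:

from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer  # needs transformers installed

# The functionary chat format handles tool prompting/parsing; the docs
# suggest attaching the model's HF tokenizer so the template is applied
# correctly.
llm = Llama(
    model_path="./functionary-small-v2.2.q4_0.gguf",  # placeholder local path
    chat_format="functionary-v2",
    tokenizer=LlamaHFTokenizer.from_pretrained("meetkai/functionary-small-v2.2-GGUF"),
    n_ctx=4096,
)

tools = [{"type": "function", "function": {
    "name": "get_current_weather",  # hypothetical tool, same shape as above
    "description": "Get the current weather for a city",
    "parameters": {"type": "object",
                   "properties": {"city": {"type": "string"}},
                   "required": ["city"]},
}}]

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
    tool_choice="auto",
)
print(response["choices"][0]["message"])  # may contain "tool_calls"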
+1
Any updates? A must-have feature IMO.
Thanks for the issue! Merging with #4386!