discourse-ai
FEATURE: Tools for models from Ollama provider
This pull request adds support for native tools and tool calls in the Ollama dialect, including the implementation of the OllamaTools class and its integration into existing methods. The changes also include test coverage for the new functionality.
NOTE: per the Ollama API docs, we would have to pass stream: false if we want to use tools (https://github.com/ollama/ollama/blob/main/docs/api.md#parameters-1). As a result, we can't rely on @streaming_mode here and have to work around it:
- If the dialect has tools, or the last message in the conversation has the tool role, we treat that call as a stream: false call.
- Otherwise, we stream the call as usual (see the sketch after this list).
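A rough sketch of that dispatch, with illustrative names rather than the exact code in this PR:

```ruby
# Sketch only: method and attribute names are illustrative, not the exact
# implementation in this PR.
module OllamaStreamDecision
  # Per the note above, Ollama's /api/chat requires stream: false whenever
  # tools are involved, so we decide up front whether this turn can stream.
  def self.native_tools_call?(messages, tools)
    return true if tools.any?
    last = messages.last
    !last.nil? && last[:role] == "tool"
  end

  def self.request_body(model:, messages:, tools: [])
    body = { model: model, messages: messages }
    if native_tools_call?(messages, tools)
      # blocking call; the full response (including any tool_calls) is parsed at once
      body[:tools] = tools if tools.any?
      body[:stream] = false
    else
      # no tools in play, so keep the normal streaming behaviour
      body[:stream] = true
    end
    body
  end
end
```

For example, a turn whose last message has the tool role (i.e. a tool result being sent back to the model) produces a stream: false body even when no new tools are attached to that particular call.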
Demo
https://github.com/user-attachments/assets/9a689b6b-95ff-4347-af09-4ba164786baa
This is not ideal; I would say a couple of things here:
- Native tool support should be optional, like our other optional LLM features (see llm_model.lookup_custom_param) - rough sketch below.
- I think this is too limiting, because you are forcing it so the user can only use one tool; what if the LLM wants to call a tool on the second turn? I think you need to give up streaming entirely to get the correct behavior.
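For the optional bit, something along these lines is what I have in mind; the param key and class names here are placeholders, and the real lookup_custom_param plumbing may differ:

```ruby
# Rough sketch (inside the Ollama dialect class), not the exact API:
# gate native tools behind an optional provider param, mirroring how other
# optional LLM features use llm_model.lookup_custom_param.
def native_tool_support?
  !!llm_model.lookup_custom_param("enable_native_tool")
end

def tools_dialect
  if native_tool_support?
    # native Ollama tool calls, which force stream: false
    DiscourseAi::Completions::Dialects::OllamaTools.new(prompt.tools)
  else
    # XML tools keep streaming available
    DiscourseAi::Completions::Dialects::XmlTools.new(prompt.tools)
  end
end
```

That way an admin can pick, per model, between XML tools with streaming and native tools without it.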
btw :wave: from Sydney!
Thanks, @SamSaffron, I'm still in limbo here, actually; that's why I put that note in the description, since it depends on your product vision as well. If you say so, I'm on board with the idea that we can give up streaming entirely to get the proper behaviour. It would also let us eliminate the trick, which in my experience is quite difficult to debug (e.g. when we assume a call is streaming but it isn't).
I still stand behind this being optional: we get tons of mileage from XML tools, so it is nice to be able to choose between XML tools with streaming and native tools without streaming.
Will do my homework on XML tools and circle back to this PR next week, cheers.
Hey @SamSaffron, I've made native tool support for the Ollama provider optional. Let me know if you think the provider parameter key needs any adjustments.
Minor: can you include only the English translations? We have an automatic process for translation, so this will just cause confusion; plus, the Hebrew translation is way off anyway :)
The copy is a bit confusing... let's go with "Enable native tool support" - a bit more verbose, but a lot clearer.
Overall the change looks good; only those minor bits need improving.
OH I think we are missing something here and we should add tests.
If native tools are disabled, you should still get tool support via XML tools; we should fall back to them.
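Roughly the kind of spec I mean; the fabricator attributes, param key, prompt construction, and tools_dialect accessor here are guesses, so adjust them to whatever the PR actually ends up with:

```ruby
# Sketch of the missing spec -- names and setup are assumptions, not the
# final code.
RSpec.describe DiscourseAi::Completions::Dialects::Ollama do
  # Assumed prompt construction; the real Prompt/tool API may differ.
  let(:prompt_with_tools) do
    DiscourseAi::Completions::Prompt.new(
      "You are a helpful bot",
      tools: [{ name: "get_weather", description: "Get the current weather" }],
    )
  end

  it "falls back to XML tools when native tools are disabled" do
    model =
      Fabricate(
        :llm_model,
        provider: "ollama",
        provider_params: { "enable_native_tool" => false },
      )

    dialect = described_class.new(prompt_with_tools, model)

    expect(dialect.tools_dialect).to be_a(DiscourseAi::Completions::Dialects::XmlTools)
  end
end
```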
Thanks heaps!
One thing I am wondering about: there are just so many providers these days with OpenAI "sort of compatible" endpoints. I wonder if we should just go with a single "OpenAI Compatible" dialect / endpoint and then add enough options to make it work across the board (e.g. together.ai / grok / ollama ... etc.).
Yes, the more I work with these providers, the more I notice they share the same API interface. It might be time to revisit and refactor their endpoints, dialects and tool classes. However, we'll need to add more tests before proceeding, I think.