
Fixing OpenAI tool calling result syntax for Ollama models

Open · ChristianWeyer opened this issue 11 months ago · 3 comments

See https://github.com/BerriAI/litellm/issues/2209#issuecomment-2007993125

ChristianWeyer · Mar 20, 2024


I'm hitting the same problem: when attempting function calling with Ollama models wrapped in the proxy, the arguments response comes back in the wrong format. Would be great to see this released!

tomatau · Mar 30, 2024

Another hand up for this - I'm trying to utilise LiteLLM+Ollama function calling with AutoGen.

I noticed that with a single function the top-level name is populated correctly, but with two functions the response leaves the top-level name empty. Not sure if that's related to this issue specifically.

With one function available, the response is:

'{\'arguments\': \'{"name": "currency_calculator", "arguments": {"base_amount": 123.45, "base_currency": "EUR", "quote_currency": "USD"}}\', \'name\': \'currency_calculator\'}'

And with two functions available, the response is:

'{\'arguments\': \'{"name": "currency_calculator", "arguments": {"base_amount": 123.45, "base_currency": "EUR", "quote_currency": "USD"}}\', \'name\': \'\'}'
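For what it's worth, until this is fixed the payload can be reshaped client-side: the outer value is a Python-dict repr whose arguments field is a JSON string that redundantly nests both the name and the real arguments. A minimal sketch of unwrapping it into the OpenAI tool-call shape (the normalize_tool_call helper below is hypothetical, not part of litellm):

```python
import ast
import json

def normalize_tool_call(raw: str) -> dict:
    """Reshape the malformed payload into the OpenAI tool-call format:
    top-level 'name', and 'arguments' as a JSON string of only the args."""
    # The outer payload is a Python-dict repr, not JSON, so parse it
    # with ast.literal_eval rather than json.loads.
    outer = ast.literal_eval(raw)
    # The inner 'arguments' value is genuine JSON.
    inner = json.loads(outer["arguments"])
    return {
        # Fall back to the nested name when the top-level one is empty,
        # as happens in the two-function case above.
        "name": outer.get("name") or inner.get("name", ""),
        "arguments": json.dumps(inner.get("arguments", {})),
    }

# The two-function response from the comment above (top-level name empty):
raw = ("{'arguments': '{\"name\": \"currency_calculator\", "
       "\"arguments\": {\"base_amount\": 123.45, \"base_currency\": \"EUR\", "
       "\"quote_currency\": \"USD\"}}', 'name': ''}")
fixed = normalize_tool_call(raw)
```

With this, fixed["name"] recovers "currency_calculator" from the nested copy, and fixed["arguments"] contains only the argument object, which is what AutoGen and other OpenAI-format consumers expect.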

marklysze · Apr 11, 2024

@krrishdholakia, what are your plans to make OpenAI tool calling bug-free? Could this PR help somehow?

Thx!

ChristianWeyer · Apr 12, 2024