MalteHB
You should ideally do it across the repo :-)
> Sure, @Sameerlite, have you looked into this yet? We are struggling a lot with it. I might take a look if you haven't gotten to it yet, but that...
How are people actually using tool calls with both OpenAI and Anthropic models simultaneously through the LiteLLM proxy?
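For context, here is a minimal sketch of what we are trying to do, assuming a LiteLLM proxy running on `localhost:4000` with model aliases `gpt-4o` and `claude-3-5-sonnet` defined in its config (the host, key, tool name, and aliases are placeholders, not from this issue):

```python
# Minimal sketch: one OpenAI-compatible client pointed at the LiteLLM proxy,
# reused for both an OpenAI and an Anthropic model with the same tool schema.
# Host, API key, and model aliases are assumptions -- use your proxy's values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-litellm-key")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

for model in ("gpt-4o", "claude-3-5-sonnet"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
        tools=tools,
        tool_choice="auto",
    )
    # If the proxy's translation layer works, both providers should come
    # back in the same OpenAI-style tool_calls shape.
    print(model, response.choices[0].message.tool_calls)
```

The whole point of routing both providers through the proxy is that the tool-call response shape should be identical regardless of backend; the errors discussed here appear when that translation breaks.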
> Closing it this week. The PR for error through v1/messages endpoint has been raised for streaming/non-streaming

What does that mean? We are still seeing errors, and it means that...
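For anyone trying to reproduce, here is a sketch of both call paths through the proxy's `v1/messages` (Anthropic-format) endpoint, again assuming a proxy on `localhost:4000` and a `claude-3-5-sonnet` alias (all placeholders):

```python
# Sketch of both v1/messages call paths via the LiteLLM proxy, using the
# Anthropic SDK pointed at the proxy. Host, key, and model alias are
# assumptions -- substitute whatever your deployment uses.
from anthropic import Anthropic

client = Anthropic(base_url="http://localhost:4000", api_key="sk-litellm-key")

# Non-streaming: a single request/response round trip.
msg = client.messages.create(
    model="claude-3-5-sonnet",
    max_tokens=256,
    messages=[{"role": "user", "content": "ping"}],
)
print(msg.content)

# Streaming: the same request consumed as an event stream.
with client.messages.stream(
    model="claude-3-5-sonnet",
    max_tokens=256,
    messages=[{"role": "user", "content": "ping"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```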
> ack @Sameerlite can you link the pr to this ticket, so we know it's closing this issue

But it is still an issue, sorry.
Hi @krrishdholakia @ishaan-jaff, this issue is still happening. In which release can we expect a fix?