Graham Neubig


Ah, @xingyaoww, any ideas how we could possibly debug this?

OK, sounds good, will do.

There's already an issue open: https://github.com/BerriAI/litellm/issues/6629

OpenHands is already working on it btw.

PR sent upstream: https://github.com/BerriAI/litellm/pull/6973

Oh, thanks so much. I'll send a separate PR.

OK, here's the open PR: https://github.com/BerriAI/litellm/pull/6975

This should be fixed upstream: https://github.com/BerriAI/litellm/pull/6994. @xingyaoww, when the new release is made, could you re-deploy the proxy?

Hey @xingyaoww, I'm still experiencing this with the most recent version of the proxy; I just saw it now:

```
litellm.BadRequestError: OpenAIException - Error code: 500 - {'error': {'message':...
```
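
For reference, here is roughly how the error surfaces on our side when calling through the proxy; this is a minimal sketch, assuming a hypothetical proxy URL, key, and model alias:

```python
import litellm

try:
    # Call the LiteLLM proxy via its OpenAI-compatible endpoint.
    # The URL, key, and model alias below are hypothetical placeholders.
    response = litellm.completion(
        model="openai/claude-3-5-sonnet",   # hypothetical model alias configured on the proxy
        api_base="http://localhost:4000",   # hypothetical proxy URL
        api_key="sk-proxy-key",             # hypothetical proxy key
        messages=[{"role": "user", "content": "hello"}],
    )
    print(response.choices[0].message.content)
except litellm.BadRequestError as e:
    # The proxy's 500 is surfaced client-side as a BadRequestError,
    # so log the full message to see the underlying error body.
    print(f"Proxy error: {e}")
```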

Same here; we're specifically hitting it at `File "/usr/lib/python3.13/site-packages/litellm/proxy/db/db_spend_update_writer.py", line 862, in _update_daily_spend`