[Feature]: Raise last error message if retries don't work
What happened?
The user only sees "IndexError: list index out of range" (for example, in LibreChat) instead of the underlying rate-limit error.
Relevant log output
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise RateLimitError(
litellm[2926698]: litellm.exceptions.RateLimitError: BedrockException: Rate Lim>
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: response = client.invoke_model_with_response_stream(
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: return self._make_api_call(operation_name, kwargs)
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise error_class(parsed_response, operation_name)
litellm[2926698]: botocore.errorfactory.ThrottlingException: An error occurred >
litellm[2926698]: During handling of the above exception, another exception occ>
litellm[2926698]: Traceback (most recent call last):
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: responses = await asyncio.gather(
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise e
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: response = await self.async_function_with_fallbacks(**kwa>
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise original_exception
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: response = await self.async_function_with_retries(*args, >
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise e
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: response = await original_function(*args, **kwargs)
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: raise e
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: deployment = await self.async_get_available_deployment(
litellm[2926698]: File "/home/librechat/litellm/lib64/python3.9/site-packages>
litellm[2926698]: rpm = healthy_deployments[0].get("litellm_params").get("r>
litellm[2926698]: IndexError: list index out of range
litellm[2926698]: INFO: 192.168.16.6:58184 - "POST /chat/completions HTTP/1>
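For context, here is a minimal standalone sketch (hypothetical names, not litellm's actual router code) of how the secondary IndexError masks the original rate-limit error: once every deployment in the model group is throttled and cooling down, the list of healthy deployments is empty, and indexing it fails while the ThrottlingException is still being handled.

```python
# Minimal sketch of the masking effect seen in the log above
# (hypothetical names, not litellm's actual router code).
class RateLimitError(Exception):
    pass

def pick_deployment(healthy_deployments):
    # Mirrors the failing line: index [0] on a possibly empty list.
    return healthy_deployments[0].get("litellm_params").get("rpm")

try:
    raise RateLimitError("BedrockException: Rate Limit Error")
except RateLimitError:
    # All deployments are cooling down, so the list is empty here; the
    # IndexError raised below is what the user ends up seeing, not the
    # rate-limit error that actually caused the failure.
    pick_deployment([])
```

Running this prints the same pattern as the log: the RateLimitError traceback, then "During handling of the above exception, another exception occurred", then the IndexError.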
Fixed this - https://github.com/BerriAI/litellm/commit/f0e48cdd53be4d8591b3e91ea5776b2d84163da7
@dirkpetersen We now raise a better exception here. I believe this is caused by retrying against this model group.
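A hedged sketch of the kind of guard this implies (a hypothetical helper, not the code in the linked commit): check for an empty deployment list before indexing and surface the last underlying error rather than an IndexError.

```python
# Hypothetical guard illustrating the behaviour the issue title asks for;
# see the linked commit for the actual change in litellm.
def get_available_deployment(healthy_deployments, last_exception=None):
    if not healthy_deployments:
        if last_exception is not None:
            # Surface the error that emptied the pool (e.g. a RateLimitError)
            # instead of letting an IndexError hide it.
            raise last_exception
        raise ValueError("No healthy deployments available for this model group")
    return healthy_deployments[0]
```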
What would you expect to happen here?
I added a comment here: https://github.com/BerriAI/litellm/commit/f0e48cdd53be4d8591b3e91ea5776b2d84163da7
yup - followed up
Updating based on the conversation on the commit, @dirkpetersen.