litellm
Include error message if no error text
Set `error_text` to the `message` attribute of the exception if `error_text` is not already set.

- Use `getattr` to retrieve the `message` attribute from the exception object
- Use the string representation of the exception if the `message` attribute is not present
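The fallback described above can be sketched as a small helper (a hypothetical illustration; the function name `build_error_text` and its parameters are assumptions, not the actual litellm code):

```python
def build_error_text(exception, error_text=None):
    """Hypothetical sketch: fill in error_text from the exception if it is empty."""
    if not error_text:
        # Prefer the exception's `message` attribute; if it is absent,
        # getattr falls back to the string representation of the exception.
        error_text = getattr(exception, "message", str(exception))
    return error_text
```

With this pattern, an exception carrying a `message` attribute (as many provider exceptions do) surfaces its text, while a plain exception still yields `str(exception)` instead of an empty error.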
Code:
```python
import litellm

model = 'openrouter/google/gemma-2-27b-it:free'
response = litellm.completion(
    model=model,
    messages=[{'role': 'user', 'content': 'Hello' * 8555}],
)
print(response.choices[0].message.content)
```
Old Error:

```
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenrouterException -
```

New Error:

```
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenrouterException - This endpoint's maximum context length is 8192 tokens. However, you requested about 10708 tokens (10708 of text input). Please reduce the length of either one, or use the "middle-out" transform to compress your prompt automatically.
```
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.