[Bug]: False alert raised when logging in to the Admin dashboard
What happened?
We have LiteLLM deployed in our environments, and every time we access the Admin dashboard and are prompted for the username and password, LiteLLM raises an alert:
Alert type: llm_exceptions
Level: High
Timestamp: 12:24:07
Message: LLM API call failed: ``
Is that expected? Is there something we can configure differently to avoid these false alerts?
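For reference, this is the kind of configuration change we are asking about: a minimal sketch of the alerting section of our proxy config.yaml, assuming the alert_types list under general_settings can be restricted so that llm_exceptions alerts are no longer sent (the field names and allowed values here are our assumption, not something we have verified against this version):

general_settings:
  # send alerts to Slack (our current setup)
  alerting: ["slack"]
  # only alert on the event types we care about; "llm_exceptions" is
  # deliberately left out so a failed Admin dashboard login would not page us
  alert_types:
    - llm_too_slow
    - llm_requests_hanging
    - budget_alerts
    - db_exceptions

Even if this is the intended way to suppress them, we would still like to know whether a failed dashboard login is supposed to be classified as an llm_exceptions event in the first place.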
Relevant log output
{
"status": "failure",
"batch_models": null,
"usage_object": null,
"user_api_key": "32575b452138107d5fa612e89b19ac8500f62eb2ca91a44ab615e722b05f23ce",
"error_information": {
"traceback": " File \"/usr/lib/python3.13/site-packages/starlette/_exception_handler.py\", line 42, in wrapped_app\n await app(scope, receive, sender)\n File \"/usr/lib/python3.13/site-packages/fastapi/routing.py\", line 110, in app\n response = await f(request)\n ^^^^^^^^^^^^^^^^\n File \"/usr/lib/python3.13/site-packages/fastapi/routing.py\", line 380, in app\n solved_result = await solve_dependencies(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\n ...<6 lines>...\n )\n ^\n File \"/usr/lib/python3.13/site-packages/fastapi/dependencies/utils.py\", line 673, in solve_dependencies\n solved = await call(**solved_result.values)\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File \"/app/litellm/proxy/auth/user_api_key_auth.py\", line 1211, in user_api_key_auth\n user_api_key_auth_obj = await _user_api_key_auth_builder(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n ...<8 lines>...\n )\n ^\n File \"/app/litellm/proxy/auth/user_api_key_auth.py\", line 1176, in _user_api_key_auth_builder\n return await UserAPIKeyAuthExceptionHandler._handle_authentication_error(\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n ...<6 lines>...\n )\n ^\n File \"/app/litellm/proxy/auth/auth_exception_handler.py\", line 118, in _handle_authentication_error\n raise e\n File \"/app/litellm/proxy/auth/user_api_key_auth.py\", line 1030, in _user_api_key_auth_builder\n raise ProxyException(\n ...<4 lines>...\n )\n",
"error_code": "",
"error_class": "ProxyException",
"llm_provider": "",
"error_message": ""
},
"applied_guardrails": null,
"user_api_key_alias": null,
"user_api_key_org_id": null,
"user_api_key_team_id": null,
"user_api_key_user_id": null,
"guardrail_information": null,
"model_map_information": null,
"mcp_tool_call_metadata": null,
"additional_usage_values": {},
"cold_storage_object_key": null,
"user_api_key_team_alias": null,
"vector_store_request_metadata": null
}
Are you an ML Ops Team?
No
What LiteLLM version are you on?
litellm-non_root:main-v1.80.0.dev2
Twitter / LinkedIn details
No response