[Bug]: Valid config keys have changed in V2
What happened?
An annoying pydantic warning triggered by: https://github.com/BerriAI/litellm/blob/f1540ceeab9a8ca1335a49b84be95e27ea7b89de/litellm/types/integrations/prometheus.py#L224
Would be amazing if you could fix this.
Relevant log output
/pydantic/_internal/_config.py:345: UserWarning: Valid config keys have changed in V2:
* 'fields' has been removed
warnings.warn(message, UserWarning)
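For reference, this warning reproduces with any model that still declares the removed V1 'fields' key on an inner Config class under pydantic 2.x. A minimal sketch (assuming pydantic>=2 is installed; the alias string is purely illustrative):

from typing import Optional
from pydantic import BaseModel

# Under pydantic 2.x, defining this class is enough to emit
# "UserWarning: Valid config keys have changed in V2: * 'fields' has been removed"
class Example(BaseModel):
    end_user: Optional[str] = None

    class Config:
        fields = {"end_user": {"alias": "end-user"}}  # V1-style config key removed in V2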
Are you an ML Ops Team?
No
What LiteLLM version are you on?
latest
Twitter / LinkedIn details
No response
I suspect this is caused by the changes introduced in v1.56.2 with https://github.com/BerriAI/litellm/pull/7421, where you can see the following in litellm/types/integrations/prometheus.py:
class UserAPIKeyLabelValues(BaseModel):
    end_user: Optional[str] = None
    user: Optional[str] = None
    hashed_api_key: Optional[str] = None
    api_key_alias: Optional[str] = None
    team: Optional[str] = None
    team_alias: Optional[str] = None
    requested_model: Optional[str] = None
    model: Optional[str] = None
    litellm_model_name: Optional[str] = None
    tags: List[str] = []
    model_id: Optional[str] = None
    api_base: Optional[str] = None
    api_provider: Optional[str] = None
    exception_status: Optional[str] = None
    exception_class: Optional[str] = None
    status_code: Optional[str] = None

    class Config:
        fields = {
            "end_user": {"alias": UserAPIKeyLabelNames.END_USER},
            "user": {"alias": UserAPIKeyLabelNames.USER},
            "hashed_api_key": {"alias": UserAPIKeyLabelNames.API_KEY_HASH},
            "api_key_alias": {"alias": UserAPIKeyLabelNames.API_KEY_ALIAS},
            "team": {"alias": UserAPIKeyLabelNames.TEAM},
            "team_alias": {"alias": UserAPIKeyLabelNames.TEAM_ALIAS},
            "requested_model": {"alias": UserAPIKeyLabelNames.REQUESTED_MODEL},
            "model": {"alias": UserAPIKeyLabelNames.v1_LITELLM_MODEL_NAME},
            "litellm_model_name": {"alias": UserAPIKeyLabelNames.v2_LITELLM_MODEL_NAME},
            "model_id": {"alias": UserAPIKeyLabelNames.MODEL_ID},
            "api_base": {"alias": UserAPIKeyLabelNames.API_BASE},
            "api_provider": {"alias": UserAPIKeyLabelNames.API_PROVIDER},
            "exception_status": {"alias": UserAPIKeyLabelNames.EXCEPTION_STATUS},
            "exception_class": {"alias": UserAPIKeyLabelNames.EXCEPTION_CLASS},
            "status_code": {"alias": UserAPIKeyLabelNames.STATUS_CODE},
        }
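The V2-native way to express the same per-field aliases is to move them onto the fields with Field(alias=...) and drop the inner Config class entirely. A rough sketch of that shape (abbreviated, not necessarily what the linked PRs do; it assumes UserAPIKeyLabelNames is in scope and that its members are plain strings, otherwise ".value" would be needed for each alias):

from typing import List, Optional
from pydantic import BaseModel, ConfigDict, Field

class UserAPIKeyLabelValues(BaseModel):
    # populate_by_name keeps construction by field name working alongside the aliases
    model_config = ConfigDict(populate_by_name=True)

    end_user: Optional[str] = Field(default=None, alias=UserAPIKeyLabelNames.END_USER)
    user: Optional[str] = Field(default=None, alias=UserAPIKeyLabelNames.USER)
    hashed_api_key: Optional[str] = Field(default=None, alias=UserAPIKeyLabelNames.API_KEY_HASH)
    # ... the remaining fields follow the same pattern ...
    tags: List[str] = []
    status_code: Optional[str] = Field(default=None, alias=UserAPIKeyLabelNames.STATUS_CODE)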
Same issue. Reverted to 1.55.12 as a workaround.
Same issue.
Yup
Getting this same issue while working with Raglite.
It is extremely annoying to get that warning on every run. Is there a simple way to disable just this one specific warning?
Temporary workaround; not a good idea, but it works for now:
import warnings
# TODO: Remove after https://github.com/BerriAI/litellm/issues/7560 is fixed
warnings.filterwarnings("ignore", category=UserWarning, module="pydantic._internal._config")
My data point: I've integrated litellm into Hugging Face smolagents, and it produces a very cumbersome warning in my library. Please fix it!
same issue
I'm experiencing the issue too.
same here
Submitted a fix here https://github.com/BerriAI/litellm/pull/8088
Thank you, everyone, for the work on this. Running @chrisgoddard's PR through CI/CD; will work on making sure it's on main by EOD.
@SmartManoj your test for catching these issues was quite good. I'll try to add that in as well.
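For anyone curious, one way such a regression test could look (a sketch only, not necessarily the test referenced above) is to import litellm in a fresh interpreter and fail if pydantic's config-deprecation warning shows up on stderr:

import subprocess
import sys

def test_import_does_not_emit_pydantic_config_warning():
    # Import in a fresh interpreter so module caching in the test process
    # cannot hide a warning that only fires on first import.
    result = subprocess.run(
        [sys.executable, "-c", "import litellm"],
        capture_output=True,
        text=True,
    )
    assert "Valid config keys have changed in V2" not in result.stderr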
I think this issue should be closed as fixed by:
- #8096
It is in fact fixed
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.