[Bug]: drop_params not working for response_format
What happened?
The drop_params setting is not working for response_format: LiteLLM throws an error if the model doesn't support it.
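A minimal repro sketch of the setup (the model string is an assumption; I'm not sure yet which model triggered it, see the comments below):

```python
import litellm

# Expectation: with drop_params on, params the provider doesn't
# support are stripped before the request is sent.
litellm.drop_params = True

# Observed: the provider errors out on response_format instead of
# LiteLLM dropping it. The model name here is illustrative only.
response = litellm.completion(
    model="perplexity/sonar",
    messages=[{"role": "user", "content": "Reply with a JSON object."}],
    response_format={"type": "json_object"},
)
```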
Relevant log output
No response
Hey @toniengelhardt, which model was this for? Looks like the error is raised by the API provider (not LiteLLM).
@krrishdholakia I think it was Perplexity, but I'm not 100% sure. I'll send the model name when I see it again. But in general, I would assume drop_params means that parameters not supported by the API are not sent in the API request, or what is the logic there?
I.e., in my case, I would add response_format to the completion() call and expect that it is sent when the model supports it, and dropped otherwise.
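For reference, LiteLLM exposes the per-model param mapping it uses for this; a quick way to check whether response_format is in it (a sketch, model string illustrative):

```python
from litellm import get_supported_openai_params

# List of OpenAI-style params LiteLLM maps for this model/provider.
# With drop_params enabled, anything outside this list should be
# dropped from the request rather than forwarded.
supported = get_supported_openai_params(model="perplexity/sonar")
print("response_format" in supported)
```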
makes sense @toniengelhardt
just bump me if you have a repro for this, i'll investigate on my end too
I haven't run into this when using response_format on unsupported models.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
We experienced something similar with the input_type param passed via extra_body when using bedrock/amazon.titan-embed-text-v2:0. input_type is supported on Vertex embeddings, but we use both providers.
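A sketch of our setup (the Vertex model string, the extra_body shape, and the input_type value are assumptions from memory):

```python
import litellm

litellm.drop_params = True

texts = ["hello world"]

# input_type is accepted for Vertex embeddings...
litellm.embedding(
    model="vertex_ai/text-embedding-004",  # illustrative
    input=texts,
    extra_body={"input_type": "RETRIEVAL_QUERY"},  # value illustrative
)

# ...but amazon.titan-embed-text-v2:0 rejects it; ideally
# drop_params would strip it here instead of the call failing.
litellm.embedding(
    model="bedrock/amazon.titan-embed-text-v2:0",
    input=texts,
    extra_body={"input_type": "RETRIEVAL_QUERY"},
)
```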