[Bug]: Groq's distil-whisper-large-v3-en not working with litellm_stable_release_branch-v1.72.0.rc
What happened?
I was testing the litellm_stable_release_branch-v1.72.0.rc Docker image, because it fixes a problem with gpt-image-1 for us, and when running our standard tests this one started failing.
The test passes with litellm_stable_release_branch-v1.71.1-stable.
This is the configuration:
- model_name: "whisper-distil-groq"
litellm_params:
model: "groq/distil-whisper-large-v3-en"
api_key: "os.environ/GROQ_API_KEY"
model_info:
mode: audio_transcription
And this is the test:
```shell
curl --request POST \
  --url http://0.0.0.0:4000/v1/audio/transcriptions \
  --header 'authorization: Bearer {{bearer_token}}' \
  --header 'content-type: multipart/form-data' \
  --form file=@<audio-file> \
  --form model=whisper-distil-groq
```
That, when working, returns:
```json
{
  "text": " Hello, how are you?"
}
```
Relevant log output
```json
{
  "error": {
    "message": "litellm.BadRequestError: GroqException - unknown param `GROQ_TRANSCRIPTION_PARAMS[]`. Received Model Group=whisper-distil-groq\nAvailable Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2",
    "type": "None",
    "param": "None",
    "code": "500"
  }
}
```
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.72.0.rc
Twitter / LinkedIn details
No response
Same problem here. Downgrading to 1.71.1 works fine.
Confirmed. And it's not limited to the English model.
For the record, I've just tried litellm_stable_release_branch-v1.72.2-stable and it still fails with the same error reported above.
As a temporary workaround, I've added a fallback from Groq to OpenAI, which works until this is fixed:
```yaml
fallbacks:
  - ...
  - ...
  - {"whisper-distil-groq": ["whisper-1-openai"]}
```
Ciao :-)
Same as #11402 (unresolved).
This error occurs when using GroqCloud both via the OpenAI-compatible provider (`OpenAIException - unknown param OPENAI_TRANSCRIPTION_PARAMS[]`), as in:
```yaml
- model_name: whisper-large-v3-turbo
  litellm_params:
    model: openai/whisper-large-v3-turbo
    api_key: os.environ/GROQ_API_KEY
    api_base: 'https://api.groq.com/openai/v1'
    additional_drop_params: ["GROQ_TRANSCRIPTION_PARAMS", "OPENAI_TRANSCRIPTION_PARAMS"]
    allowed_openai_params: ["model", "response_format"]
```
and directly via the groq provider (`GroqException - unknown param GROQ_TRANSCRIPTION_PARAMS[]`):
```yaml
- model_name: whisper-large-v3-turbo
  litellm_params:
    model: groq/whisper-large-v3-turbo
    api_key: os.environ/GROQ_API_KEY
```
The worst part is that neither `allowed_openai_params` nor `additional_drop_params` actually works as a workaround. I got `GroqException - unknown param additional_drop_params[]`, which looks like a separate bug in how `additional_drop_params` is handled by the LiteLLM proxy.
So until someone merges a PR, transcription with Groq is pretty much dead in the water.
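For what it's worth, the upstream endpoint itself can be sanity-checked by calling Groq directly, bypassing the proxy entirely (a sketch; the endpoint path is taken from the `api_base` above, and `audio.mp3` is a placeholder file name):

```shell
# Direct call to Groq's OpenAI-compatible transcription endpoint,
# bypassing LiteLLM. audio.mp3 is a placeholder file name.
curl --request POST \
  --url https://api.groq.com/openai/v1/audio/transcriptions \
  --header "authorization: Bearer $GROQ_API_KEY" \
  --form file=@audio.mp3 \
  --form model=whisper-large-v3-turbo
```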
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
I think this can be safely closed, since distil-whisper-large-v3-en was deprecated and removed in Aug 2025.
Also, I can confirm that both groq/whisper-large-v3 and groq/whisper-large-v3-turbo (the currently available ones) are working OK; a replacement config entry is sketched below.
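This is a minimal drop-in replacement entry, mirroring the original config but with a model Groq still serves (a sketch; the model_name is just an example):

```yaml
# Same shape as the original entry, swapping in a currently available model.
- model_name: "whisper-groq"
  litellm_params:
    model: "groq/whisper-large-v3-turbo"
    api_key: "os.environ/GROQ_API_KEY"
  model_info:
    mode: audio_transcription
```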
Also, for cross-linking: when trying the same inside an (arm64) Docker container, I got #16920.
But yes, this exact issue can be closed.
Ciao :-)