
[Bug]: Groq's distil-whisper-large-v3-en not working with litellm_stable_release_branch-v1.72.0.rc

Open stronk7 opened this issue 7 months ago • 6 comments

What happened?

I was testing the litellm_stable_release_branch-v1.72.0.rc Docker image because it fixes a problem with gpt-image-1 for us, and while running our standard tests this one started failing.

The test is passing with litellm_stable_release_branch-v1.71.1-stable.

This is the configuration:

  - model_name: "whisper-distil-groq"
    litellm_params:
      model: "groq/distil-whisper-large-v3-en"
      api_key: "os.environ/GROQ_API_KEY"
      model_info:
        mode: audio_transcription

And this is the test:

curl --request POST \
  --url http://0.0.0.0:4000/v1/audio/transcriptions \
  --header 'authorization: Bearer {{bearer_token}}' \
  --header 'content-type: multipart/form-data' \
  --form [email protected] \
  --form model=whisper-distil-groq

When working, it returns:

{
  "text": " Hello, how are you?"
}

Relevant log output

{
  "error": {
    "message": "litellm.BadRequestError: GroqException - unknown param `GROQ_TRANSCRIPTION_PARAMS[]`. Received Model Group=whisper-distil-groq\nAvailable Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2",
    "type": "None",
    "param": "None",
    "code": "500"
  }
}

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.72.0.rc

Twitter / LinkedIn details

No response

stronk7 avatar Jun 02 '25 10:06 stronk7

Same problem here. Downgrade to 1.71.1 works fine

jerry-intrii avatar Jun 11 '25 06:06 jerry-intrii

Confirmed. It is not limited to the English-only model.

bgeneto avatar Jun 12 '25 00:06 bgeneto

For the record, I've just tried litellm_stable_release_branch-v1.72.2-stable and it still fails with the same error reported above.

As a temporary workaround, I've added a fallback from Groq to OpenAI, which works until this is fixed:

fallbacks:
    - ...
    - ...
    - {"whisper-distil-groq": ["whisper-1-openai"]}
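For the fallback to fire, the target also needs its own entry in model_list. A minimal sketch of the full shape (the whisper-1-openai entry, the OPENAI_API_KEY variable name, and the router_settings placement are my assumptions, not taken from the thread):

```yaml
model_list:
  - model_name: "whisper-distil-groq"
    litellm_params:
      model: "groq/distil-whisper-large-v3-en"
      api_key: "os.environ/GROQ_API_KEY"
      model_info:
        mode: audio_transcription
  # Hypothetical OpenAI-backed fallback target (name and env var assumed)
  - model_name: "whisper-1-openai"
    litellm_params:
      model: "openai/whisper-1"
      api_key: "os.environ/OPENAI_API_KEY"
      model_info:
        mode: audio_transcription

router_settings:
  fallbacks:
    - {"whisper-distil-groq": ["whisper-1-openai"]}
```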

Ciao :-)

stronk7 avatar Jun 15 '25 09:06 stronk7

Same as #11402 (unsolved)

This error occurs when using GroqCloud both via the openai-compatible provider (OpenAIException - unknown param OPENAI_TRANSCRIPTION_PARAMS[]), as in:

  - model_name: whisper-large-v3-turbo
    litellm_params:
      model: openai/whisper-large-v3-turbo
      api_key: os.environ/GROQ_API_KEY
      api_base: 'https://api.groq.com/openai/v1'
      additional_drop_params: ["GROQ_TRANSCRIPTION_PARAMS", "OPENAI_TRANSCRIPTION_PARAMS"]
      allowed_openai_params: ["model", "response_format"]

or directly via the groq provider (GroqException - unknown param GROQ_TRANSCRIPTION_PARAMS[]):

  - model_name: whisper-large-v3-turbo
    litellm_params:
      model: groq/whisper-large-v3-turbo
      api_key: os.environ/GROQ_API_KEY

The worst part is that neither allowed_openai_params nor additional_drop_params actually works as a workaround. I got GroqException - unknown param additional_drop_params[], which seems to be a different bug in how additional_drop_params is handled by the LiteLLM proxy.

So until someone merges a PR, transcription with Groq is pretty much dead in the water.

bgeneto avatar Aug 23 '25 15:08 bgeneto

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] avatar Nov 22 '25 00:11 github-actions[bot]

I think this can be safely closed, since distil-whisper-large-v3-en was deprecated and removed in Aug 2025.

Also, I can confirm that both groq/whisper-large-v3 and groq/whisper-large-v3-turbo (the currently available ones) are working OK.
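For anyone landing here later, switching to one of the still-available models is just a model swap in the same config shape (a sketch; the whisper-groq alias is my own, not from the thread):

```yaml
  - model_name: "whisper-groq"
    litellm_params:
      model: "groq/whisper-large-v3-turbo"
      api_key: "os.environ/GROQ_API_KEY"
      model_info:
        mode: audio_transcription
```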

Also, for cross-linking: when trying the same within an (arm64) Docker container, I got #16920.

But yes, exactly this issue can be closed.

Ciao :-)

stronk7 avatar Nov 22 '25 09:11 stronk7