
[Bug]: Claude-3.7 'thinking' parameter not supported with litellm_proxy

Open · cauchy221 opened this issue 9 months ago

What happened?

I'm using litellm_proxy with a custom base_url. The request looks like this:

import os
import litellm

response = litellm.completion(
    api_key=os.environ.get("LITELLM_API_KEY"),
    base_url=MY_URL,  # my custom proxy base URL
    model='litellm_proxy/xxx/claude-3-7-sonnet-20250219',
    messages=xxx,
    temperature=0.0,
    thinking={
        "type": "enabled",
        "budget_tokens": 200
    }
)

I got UnsupportedParamsError:

litellm.exceptions.UnsupportedParamsError: litellm.UnsupportedParamsError: litellm_proxy does not support parameters: {'thinking': {'type': 'enabled', 'budget_tokens': 200}}, for model=xxx/claude-3-7-sonnet-20250219. To drop these, set `litellm.drop_params=True` or for proxy:

`litellm_settings:
 drop_params: true`
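For what it's worth, the drop_params workaround from the error message can also be set client-side, but it strips the thinking block instead of forwarding it, so extended thinking is effectively disabled. A minimal sketch (the proxy URL and model alias below are placeholders):

import os
import litellm

# Workaround named in the error: drop params the provider doesn't recognize.
# With this set, the thinking block is removed before the request is sent.
litellm.drop_params = True

response = litellm.completion(
    api_key=os.environ.get("LITELLM_API_KEY"),
    base_url="http://localhost:4000",  # placeholder for the custom proxy URL
    model="litellm_proxy/claude-3-7-sonnet-20250219",  # placeholder model alias
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.0,
    thinking={"type": "enabled", "budget_tokens": 200},  # silently dropped
)
print(response)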

Relevant log output


Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.61.20

Twitter / LinkedIn details

No response

cauchy221 avatar Mar 01 '25 16:03 cauchy221

scoping

colesmcintosh avatar Mar 17 '25 17:03 colesmcintosh

@cauchy221 I was able to use Claude 3.7 Sonnet w/ thinking on a proxy server using this code:

import litellm

response = litellm.completion(
    api_key="sk-************",  # The master_key from config
    base_url="http://localhost:4000",
    model='claude-3-7-sonnet-20250219',
    messages=[{"role": "user", "content": "How can I solve world hunger?"}],
    max_tokens=2001,
    temperature=1,
    thinking={
        "type": "enabled",
        "budget_tokens": 2000
    }
)

print(response)

Important requirements for the Claude thinking parameter:

  1. budget_tokens must be ≥ 1024
  2. When thinking is enabled, temperature must be set to 1

Can you try this and see if it solves your issue?

colesmcintosh avatar Mar 19 '25 00:03 colesmcintosh

@colesmcintosh the user wants to pass model=litellm_proxy/<their-model>

since we specify it here: https://docs.litellm.ai/docs/providers/litellm_proxy
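For reference, a minimal sketch of that call shape with the thinking constraints listed above applied (the proxy URL, key, and model alias are placeholders); on v1.61.20 this routing path is the one that raises the UnsupportedParamsError:

import os
import litellm

response = litellm.completion(
    model="litellm_proxy/claude-3-7-sonnet-20250219",  # routed via the litellm_proxy provider
    api_base="http://localhost:4000",                  # placeholder for the running proxy's base URL
    api_key=os.environ.get("LITELLM_API_KEY"),
    messages=[{"role": "user", "content": "How can I solve world hunger?"}],
    max_tokens=2001,
    temperature=1,  # required when thinking is enabled
    thinking={
        "type": "enabled",
        "budget_tokens": 2000  # must be >= 1024
    }
)
print(response)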

ishaan-jaff avatar Mar 19 '25 01:03 ishaan-jaff

Ahhh I see, I was able to reproduce the error after adding the litellm_proxy/ prefix. Checking now.

colesmcintosh avatar Mar 19 '25 03:03 colesmcintosh

Same issue here. Happy to help test if a fix is in progress.

maszhongming avatar Mar 25 '25 17:03 maszhongming

Closing as this is now fixed.

krrishdholakia avatar Apr 28 '25 22:04 krrishdholakia

> Closing as this is now fixed.

Thanks @krrishdholakia - could you confirm which commit/release this was fixed in?

graham33 avatar Apr 30 '25 10:04 graham33