
[Feature]: Support the openrouter/anthropic/claude-3.7-sonnet:thinking

kang-dev123 opened this issue · 2 comments

The Feature

Add support for the openrouter/anthropic/claude-3.7-sonnet:thinking model.

Motivation, pitch

Add support for the openrouter/anthropic/claude-3.7-sonnet:thinking model, so that extended reasoning output can be requested through litellm.

Are you a ML Ops Team?

No

Twitter / LinkedIn details

No response

kang-dev123 commented Mar 14 '25 09:03

It seems I was able to make it work on my litellm proxy:

  1. Disable drop_params.
  2. Make sure you are using the :thinking version of the OpenRouter model.
  3. Add the reasoning settings to the model's entry in proxy_server_config.yaml (see the sketch after this list):
      reasoning: {
        "max_tokens": 8192
      }
  4. Make sure your litellm proxy is up to date. The latest image, ghcr.io/berriai/litellm:main-stable, is 3 months old, so I switched to main-latest.
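
For reference, here is a minimal sketch of what the relevant parts of proxy_server_config.yaml could look like, assuming the standard litellm model_list layout; the claude-3.7-sonnet-thinking alias and the os.environ/OPENROUTER_API_KEY reference are placeholders, and the reasoning block is the OpenRouter parameter reported above:

    model_list:
      - model_name: claude-3.7-sonnet-thinking        # client-facing alias (placeholder)
        litellm_params:
          model: openrouter/anthropic/claude-3.7-sonnet:thinking
          api_key: os.environ/OPENROUTER_API_KEY      # read the key from the environment
          reasoning:                                  # step 3: forwarded to OpenRouter
            max_tokens: 8192                          # reasoning token budget

    litellm_settings:
      drop_params: false   # step 1: keep non-OpenAI params like reasoning instead of dropping them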

It seems to work fine, and because I disabled drop_params I'm confident the reasoning parameter is actually being passed through to OpenRouter rather than silently dropped!
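
To check the setup end to end, you can hit the proxy with the OpenAI SDK and pass the reasoning budget via extra_body. This is only a sketch, assuming the proxy runs locally on its default port 4000 and uses the alias from the config sketch above; the model alias and proxy key are placeholders:

    # Sketch: query the litellm proxy through the OpenAI client.
    # Assumptions: local proxy on the default port; model alias and
    # proxy key are placeholders from the config sketch above.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:4000",  # litellm proxy endpoint
        api_key="sk-proxy-key",            # your proxy key, if auth is enabled (placeholder)
    )

    resp = client.chat.completions.create(
        model="claude-3.7-sonnet-thinking",  # alias from model_list
        messages=[{"role": "user", "content": "What is 17 * 23? Think it through."}],
        # forwarded to OpenRouter because drop_params is disabled
        extra_body={"reasoning": {"max_tokens": 8192}},
    )
    print(resp.choices[0].message.content)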

thiswillbeyourgithub commented Mar 14 '25 13:03

Thank you for your reply. I will try it tomorrow.


kang-dev123 commented Mar 14 '25 13:03