
[Bug]: Anthropic messages provider config not found for model: us.anthropic.claude-3-7-sonnet-20250219-v1:0

Status: Open · mrh-chain opened this issue 7 months ago · 2 comments

What happened?

I am trying to hook up Claude Code to LiteLLM Proxy.

I've got LiteLLM Proxy running, with the following config:

    model_list:
    - model_name: "claude-3.7-sonnet"
      litellm_params:
        model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
        aws_region_name: us-east-1
        model_id: "arn:aws:bedrock:us-east-1:[REDACTED]:application-inference-profile/[REDACTED]"
        input_cost_per_token: 0.000003
        output_cost_per_token: 0.000015
    - model_name: "claude-3-7-sonnet-20250219"
      litellm_params:
        model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
        aws_region_name: us-east-1
        model_id: "arn:aws:bedrock:us-east-1:[REDACTED]:application-inference-profile/[REDACTED]"
        input_cost_per_token: 0.000003
        output_cost_per_token: 0.000015

If I make a request with an OpenAI client to the LiteLLM Proxy and specify the model as "claude-3.7-sonnet", everything works as expected.
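
For reference, a minimal sketch of that working call using the OpenAI SDK; the base URL and API key are placeholders for the proxy address and a LiteLLM key:

    from openai import OpenAI

    # OpenAI SDK pointed at the LiteLLM Proxy; base URL and key are placeholders.
    client = OpenAI(base_url="http://0.0.0.0:4000", api_key="<litellm-key>")

    # "claude-3.7-sonnet" is resolved by the proxy's model_list above.
    response = client.chat.completions.create(
        model="claude-3.7-sonnet",
        messages=[{"role": "user", "content": "Hello, Claude"}],
    )
    print(response.choices[0].message.content)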

But given that Claude Code uses the Anthropic SDK and hardcodes the model as "claude-3-7-sonnet-20250219", I tried adding that as a model backed by the same AWS Bedrock application inference profile.

However, when I make requests with the Anthropic SDK to the LiteLLM Proxy, specifying the model "claude-3-7-sonnet-20250219", I get an error 500 back with the message "Anthropic messages provider config not found for model: us.anthropic.claude-3-7-sonnet-20250219-v1:0".
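
A minimal sketch of the failing call, assuming the same proxy address and a placeholder key:

    from anthropic import Anthropic

    # Anthropic SDK pointed at the LiteLLM Proxy; base URL and key are placeholders.
    client = Anthropic(base_url="http://0.0.0.0:4000", api_key="<litellm-key>")

    # This goes through the proxy's Anthropic /v1/messages route and comes back
    # as a 500 with the "provider config not found" message.
    message = client.messages.create(
        model="claude-3-7-sonnet-20250219",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello, Claude"}],
    )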

Do I need to specify something in the model settings for it to work when called via the Anthropic routes?

Relevant log output

ERROR: endpoints.py:241 - litellm.proxy.proxy_server.anthropic_response(): Exception occured - Anthropic messages provider config not found for model: us.anthropic.claude-3-7-sonnet-20250219-v1:0
Traceback (most recent call last):
  File "/usr/local/lib/python3.13/site-packages/litellm/proxy/anthropic_endpoints/endpoints.py", line 190, in anthropic_response
    response = await llm_response
               ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 3134, in async_wrapper
    return await self._ageneric_api_call_with_fallbacks(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<2 lines>...
    )
    ^
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2498, in _ageneric_api_call_with_fallbacks
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/router.py", line 2485, in _ageneric_api_call_with_fallbacks
    response = await response  # type: ignore
               ^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/utils.py", line 1460, in wrapper_async
    raise e
  File "/usr/local/lib/python3.13/site-packages/litellm/utils.py", line 1321, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.13/site-packages/litellm/llms/anthropic/experimental_pass_through/messages/handler.py", line 110, in anthropic_messages
    raise ValueError(
        f"Anthropic messages provider config not found for model: {model}"
    )
ValueError: Anthropic messages provider config not found for model: us.anthropic.claude-3-7-sonnet-20250219-v1:0

Are you a ML Ops Team?

Yes

What LiteLLM version are you on?

v1.67.3

Twitter / LinkedIn details

No response

mrh-chain · Apr 24 '25 17:04

I should perhaps also note that calling the claude-3-7-sonnet-20250219 model via the OpenAI SDK works without issue.

mrh-chain · Apr 24 '25 17:04

Maybe the issue is in how the Anthropic SDK you are using executes requests against the LiteLLM proxy: it hits the Anthropic-specific route (proxy/anthropic_endpoints/endpoints.py) instead of the general route that follows the Bedrock logic. Can you see what REST request the SDK is trying to call? You should get some logs in the LiteLLM proxy to help.
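
One way to surface those request logs, assuming LiteLLM's documented debug settings, is to enable verbose logging in the proxy config (or start the proxy with the --detailed_debug flag):

    litellm_settings:
      set_verbose: True  # assumption: LiteLLM's documented verbose flag; logs raw incoming requests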

wagnerjt · May 12 '25 18:05

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] · Aug 11 '25 00:08

@mrh-chain I tried it out locally but cannot reproduce it.

Here is my config file, the same as yours. I created the Application Inference Profile using us.anthropic.claude-3-7-sonnet-20250219-v1:0:

    model_list:
    - model_name: "claude-3.7-sonnet"
      litellm_params:
        model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
        aws_region_name: us-east-1
        model_id: "arn:aws:bedrock:us-east-1:<aws-account-id>:application-inference-profile/<aip-id>"
        input_cost_per_token: 0.000003
        output_cost_per_token: 0.000015
    - model_name: "claude-3-7-sonnet-20250219"
      litellm_params:
        model: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
        aws_region_name: us-east-1
        model_id: "arn:aws:bedrock:us-east-1:<aws-account-id>:application-inference-profile/<aip-id>"
        input_cost_per_token: 0.000003
        output_cost_per_token: 0.000015

When I test both models, they work as expected:

    from anthropic import Anthropic

    client = Anthropic(
        base_url="http://0.0.0.0:4000/",
        api_key="<my-bedrock-api-key>",
    )

    message = client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Hello, Claude",
            }
        ],
        model="claude-3.7-sonnet",
    )
    print(message.content)

> [TextBlock(citations=None, text="Hello! How can I assist you today? I'm here to help with whatever questions or topics you'd like to discuss.", type='text')]

    from anthropic import Anthropic

    client = Anthropic(
        base_url="http://0.0.0.0:4000/",
        api_key="<my-bedrock-api-key>",
    )

    message = client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Hello, Claude",
            }
        ],
        model="claude-3-7-sonnet-20250219",
    )
    print(message.content)

> [TextBlock(citations=None, text="Hello! It's nice to meet you. How can I help you today? I'm ready to assist with information, answer questions, have a conversation, or help with various tasks. What would you like to talk about?", type='text')]

Can you provide sample test code that reproduces the error? Thank you.

0x-fang · Aug 14 '25 22:08

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

github-actions[bot] · Nov 14 '25 00:11