[Bug]: Anthropic images fail with unified anthropic endpoint
What happened?
I am attempting to send the following message through the proxy to Claude 3 Opus:
[{
    "role": "user",
    "content": [{
        "type": "text",
        "text": "what is in this image?"
    }]
}, {
    "role": "assistant",
    "content": [{
        "type": "text",
        "text": "I'm sorry, but no image has been uploaded to our conversation yet. Could you please try uploading an image and I'll do my best to describe what I see in it!"
    }]
}, {
    "role": "user",
    "content": [{
        "type": "image",
        "source": {
            "type": "base64",
            "media_type": "image/png",
            "data": "iVBORw0KGgoAAAANSUhEUgAABJ..."
        }
    }]
}]
This works fine through the normal Anthropic API, but through the proxy it fails with the error:
LiteLLM Proxy:ERROR: proxy_server.py:5812 - litellm.proxy.proxy_server.anthropic_response(): Exception occured - litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"messages.2.content.0.image.source.base64.media_type: Input should be 'image/jpeg', 'image/png', 'image/gif' or 'image/webp'"}}
Strangely, images sent through LibreChat work fine. For example:
{
    "messages": [{
        "role": "user",
        "content": [{
            "text": "explain this",
            "type": "text"
        },
        {
            "type": "image_url",
            "image_url": {
                "url": "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABJIAAAHMC...",
                "detail": "auto"
            }
        }]
    }]
}
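For reference, both blocks carry the same image; here is a rough sketch of the mapping in Python (the helper name and the placeholder payload are illustrative only, not part of LiteLLM):

def anthropic_image_to_openai(block: dict) -> dict:
    """Convert an Anthropic base64 image block into an OpenAI-style image_url block."""
    source = block["source"]
    assert source["type"] == "base64", "only base64 sources are handled in this sketch"
    return {
        "type": "image_url",
        "image_url": {
            "url": f"data:{source['media_type']};base64,{source['data']}",
            "detail": "auto",
        },
    }

# Example with placeholder data (not a real image):
anthropic_block = {
    "type": "image",
    "source": {"type": "base64", "media_type": "image/png", "data": "<BASE64_PNG_DATA>"},
}
print(anthropic_image_to_openai(anthropic_block))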
Relevant log output
19:42:12 - LiteLLM Proxy:ERROR: proxy_server.py:5812 - litellm.proxy.proxy_server.anthropic_response(): Exception occured - litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"messages.2.content.0.image.source.base64.media_type: Input should be 'image/jpeg', 'image/png', 'image/gif' or 'image/webp'"}}
Received Model Group=claude-3-opus
Available Model Group Fallbacks=None LiteLLM Retried: 2 times, LiteLLM Max Retries: 3
Traceback (most recent call last):
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/llms/anthropic/chat/handler.py", line 384, in acompletion_function
response = await async_handler.post(
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/llms/custom_httpx/http_handler.py", line 159, in post
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/llms/custom_httpx/http_handler.py", line 119, in post
response.raise_for_status()
File "/srv/litellm/venv/lib/python3.10/site-packages/httpx/_models.py", line 763, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 470, in acompletion
response = await init_response
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/llms/anthropic/chat/handler.py", line 403, in acompletion_function
raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"messages.2.content.0.image.source.base64.media_type: Input should be 'image/jpeg', 'image/png', 'image/gif' or 'image/webp'"}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 5725, in anthropic_response
response = await llm_response
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1954, in aadapter_completion
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 1942, in aadapter_completion
response = await self.async_function_with_fallbacks(**kwargs)
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2729, in async_function_with_fallbacks
raise original_exception
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2587, in async_function_with_fallbacks
response = await self.async_function_with_retries(*args, **kwargs)
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2898, in async_function_with_retries
raise original_exception
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2812, in async_function_with_retries
response = await self.make_call(original_function, *args, **kwargs)
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2905, in make_call
response = await original_function(*args, **kwargs)
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2021, in _aadapter_completion
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/router.py", line 2008, in _aadapter_completion
response = await response # type: ignore
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 4287, in aadapter_completion
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 4270, in aadapter_completion
response: Union[ModelResponse, CustomStreamWrapper] = await acompletion(**new_kwargs) # type: ignore
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 1227, in wrapper_async
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/utils.py", line 1083, in wrapper_async
result = await original_function(*args, **kwargs)
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/main.py", line 492, in acompletion
raise exception_type(
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2116, in exception_type
raise e
File "/srv/litellm/venv/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 469, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"messages.2.content.0.image.source.base64.media_type: Input should be 'image/jpeg', 'image/png', 'image/gif' or 'image/webp'"}}
@Cyberes please use the pass-through endpoint - https://docs.litellm.ai/docs/pass_through/anthropic_completion
LITELLM_PROXY_BASE_URL/anthropic
I'm getting {"detail":"Not Found"} for the following curl request:
curl --request POST \
  --url https://llm.example.com/anthropic/v1/messages \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --header "Authorization: bearer sk-anything" \
  --data '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello, world"}
    ]
  }'
please bump versions to latest @Cyberes
I ran git pull and am still getting the 404: "POST /anthropic/v1/messages HTTP/1.1" 404 Not Found
I don't need to configure the passthrough endpoint, correct?
@Cyberes please use the pass-through endpoint - https://docs.litellm.ai/docs/pass_through/anthropic_completion
LITELLM_PROXY_BASE_URL/anthropic
What is the current positioning of the /v1/messages endpoint? I don't find a note about it in the documentation, is this endpoint deprecated?
@huangyafei use anthropic/v1/messages for the passthrough route to anthropic
the /v1/messages route is still in beta
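For SDK users, a minimal sketch of pointing the official Anthropic Python SDK at the pass-through route (the base URL and key below are placeholders for your own deployment):

import anthropic

# Point the Anthropic SDK at the proxy's pass-through route instead of
# api.anthropic.com. Base URL and key are placeholders.
client = anthropic.Anthropic(
    base_url="https://llm.example.com/anthropic",  # LITELLM_PROXY_BASE_URL/anthropic
    api_key="sk-anything",                          # your LiteLLM virtual key
)

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(message.content)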
Okay, I got it. Thank you.
However, I have two providers, Anthropic and Bedrock, and my clients use Anthropic-compatible endpoints.
I want to load-balance across these two providers via the proxy router; I am mainly using the claude-3-5-sonnet model.
I don't know if there is a good solution for this.
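Something like the following is what I have in mind, sketched with the Python Router (the Bedrock model ID, region, and env vars are just examples):

import os
from litellm import Router

# Two deployments sharing one model_name, so requests to "claude-3-5-sonnet"
# are load-balanced across Anthropic and Bedrock.
router = Router(
    model_list=[
        {
            "model_name": "claude-3-5-sonnet",
            "litellm_params": {
                "model": "anthropic/claude-3-5-sonnet-20241022",
                "api_key": os.environ.get("ANTHROPIC_API_KEY"),
            },
        },
        {
            "model_name": "claude-3-5-sonnet",
            "litellm_params": {
                "model": "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
                "aws_region_name": "us-east-1",
            },
        },
    ],
)

response = router.completion(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Hello, world"}],
)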
@ishaan-jaff I'm on the latest version, still getting the 404, and still have the original exception. I don't need to configure or enable the passthrough endpoint, correct?
@Cyberes your swagger should indicate if the pass-through route exists, like this - https://litellm-api.up.railway.app/#/Anthropic%20Pass-through
if you don't see it, you should confirm your version isn't cached (e.g. check on swagger if you're on v1.55.5)
Okay, looks like I'm on 1.51.0. How do I reset it?
bump versions, here's the latest tag - https://github.com/BerriAI/litellm/releases/tag/v1.55.4
Sorry, I got confused about a few things. It's working now, thanks for your help. I'll leave this issue open since the unified anthropic endpoint is still broken.
was this solved?
Please fix this. I am using the SDK and it is not working either.
I'm having this problem with 1.61.6; it works fine through Anthropic directly but not through the proxy. Same logs.
Hi all,
For a unified endpoint, please use the OpenAI /chat/completions route.
The unified Anthropic route is experimental.
For Anthropic pass-through, please use the /anthropic/* route.
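For example, an image request over the recommended /chat/completions route looks roughly like this (base URL, key, model group name, and image data are placeholders):

from openai import OpenAI

# OpenAI-format image request sent to the proxy's unified /chat/completions
# route. Base URL, key, and model group name are placeholders; the image
# data below is a placeholder as well.
client = OpenAI(base_url="https://llm.example.com/v1", api_key="sk-anything")

response = client.chat.completions.create(
    model="claude-3-opus",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "what is in this image?"},
            {"type": "image_url", "image_url": {"url": "data:image/png;base64,<BASE64_PNG_DATA>"}},
        ],
    }],
)
print(response.choices[0].message.content)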
The unified Anthropic route seems to be working fine now.