[Bug]: Anthropic `v1/messages?beta=true` returns invalid chunk type.
What happened?
Description
When using the LiteLLM proxy as a gateway with the `v1/messages?beta=true` endpoint in streaming mode, I've encountered an issue where `litellm.anthropic_messages` returns dict objects and `endpoints.py` passes them directly into the response stream, causing errors.
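To illustrate what I believe is happening, here is a minimal sketch of the path. The names `async_data_generator` and `endpoints.py` come from the debug log further down; everything else is illustrative and not LiteLLM's actual code:

```python
from typing import Any, AsyncIterator

async def fake_anthropic_messages() -> AsyncIterator[dict]:
    # Illustrative stand-in for what litellm.anthropic_messages appears to yield
    # in streaming mode: raw Anthropic event dicts rather than encoded SSE lines.
    yield {"type": "message_start", "message": {"role": "assistant", "content": []}}
    yield {"type": "message_stop"}

async def async_data_generator(stream: AsyncIterator[Any]) -> AsyncIterator[Any]:
    async for chunk in stream:
        # Reported behavior: dict chunks are forwarded as-is. Starlette's
        # StreamingResponse later calls chunk.encode(charset) on each chunk,
        # which fails for dicts (see traceback below).
        yield chunk
```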
Reproduction Steps
- Start a LiteLLM proxy server
- Send the following curl request:
curl -X POST http://localhost:4000/v1/messages?beta=true \
-H "content-type: application/json" \
-H "accept: application/json" \
-d '{
"model": "claude-3-5-haiku-20241022",
"max_tokens": 512,
"messages": [
{
"role": "user",
"content": "Hi"
}
],
"system": [
{
"type": "text",
"text": "You are an AI Agent"
}
],
"temperature": 0,
"stream": true
}'
Relevant log output
Traceback (most recent call last):
File "/root/litellm/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 407, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/root/litellm/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
return await self.app(scope, receive, send)
File "/root/litellm/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 187, in __call__
raise exc
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/middleware/base.py", line 185, in __call__
with collapse_excgroups():
File "/root/.local/share/uv/python/cpython-3.10.17-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
raise exc
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/responses.py", line 255, in wrap
await func()
File "/root/litellm/.venv/lib/python3.10/site-packages/starlette/responses.py", line 246, in stream_response
chunk = chunk.encode(self.charset)
AttributeError: 'dict' object has no attribute 'encode'
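Reduced to its core, the failing step is the chunk-encoding line in Starlette's `stream_response`: chunks must be str or bytes, so a dict chunk fails immediately. This is a sketch, not Starlette's exact code:

```python
charset = "utf-8"
chunk = {"type": "message_start"}  # what the proxy currently yields
if not isinstance(chunk, bytes):
    # corresponds to starlette/responses.py line 246 in the traceback above
    chunk = chunk.encode(charset)  # AttributeError: 'dict' object has no attribute 'encode'
```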
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.69.1 commit: b7fc72628cdb424c11d03f2c53ea3b2f545d3c18
Twitter / LinkedIn details
No response
Debug Log:
LiteLLM Proxy:DEBUG: endpoints.py:35 - async_data_generator: received streaming chunk - {'type': 'message_start', 'message': {'id': 'msg_bdrk_01BoAVQFrH5M7dZzej9rNr8m', 'type': 'message', 'role': 'assistant', 'model': 'claude-3-5-sonnet-20240620', 'content': [], 'stop_reason': None, 'stop_sequence': None, 'usage': {'input_tokens': 13, 'output_tokens': 4}}}
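For comparison, serializing that logged chunk into an SSE event string before yielding it would give Starlette something it can encode. The `event:`/`data:` framing below follows Anthropic's streaming format; it is only a sketch of a possible direction, not LiteLLM's actual code:

```python
import json

chunk = {
    "type": "message_start",
    "message": {
        "id": "msg_bdrk_01BoAVQFrH5M7dZzej9rNr8m",
        "type": "message",
        "role": "assistant",
        "model": "claude-3-5-sonnet-20240620",
        "content": [],
        "stop_reason": None,
        "stop_sequence": None,
        "usage": {"input_tokens": 13, "output_tokens": 4},
    },
}

# Event name taken from the chunk's "type" field; the result is a plain str,
# which StreamingResponse can encode without error.
sse_line = f"event: {chunk['type']}\ndata: {json.dumps(chunk)}\n\n"
print(sse_line)
```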
@stillfox-lee does it only happen with beta=true ?
No, the same issue occurs even without the `beta` parameter, or with `beta=false`.
Hmm, I'm unable to repro this @stillfox-lee.
I tried your exact request:
curl -X POST http://localhost:4000/v1/messages \
-H "content-type: application/json" \
-H "accept: application/json" \
-H "Authorization: Bearer sk-1234" \
-d '{
"model": "claude-3-5-haiku-20241022",
"max_tokens": 512,
"messages": [
{
"role": "user",
"content": "Hi"
}
],
"system": [
{
"type": "text",
"text": "You are an AI Agent"
}
],
"temperature": 0,
"stream": true
}'