litellm
[Bug]: Anthropic Function Streaming doesn't pass "stream" parameter.
What happened?
It appears the library fails to pass the "stream" parameter to Anthropic when creating streaming messages with tool use.
The attached log comes from running the test_acompletion_claude_3_function_call_with_streaming function.
Notice the stream parameter is not passed within the CURL command. As a result, the response isn't a streaming response, and so streaming tool calls do not work.
Manually re-running the curl command with '"stream": true' appended to the end of the payload corrects the issue.
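To illustrate, the request body only needs the extra key to behave correctly. A minimal sketch of the payload (trimmed from the log below, tools and API key omitted), with the missing parameter added:

```python
import json

# Request body trimmed from the logged curl command. The only change needed
# to get a streaming response is the "stream": True key, which LiteLLM
# currently drops from the body it sends to Anthropic.
payload = {
    "model": "claude-3-opus-20240229",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's the weather like in Boston today in fahrenheit?"}
            ],
        }
    ],
    "max_tokens": 4096,
    "stream": True,  # missing from the request LiteLLM actually sends
}
body = json.dumps(payload)
```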
A little digging suggests that litellm is not set up to handle streaming function calls at all: the strings "input_json_delta" and "partial_json" appear in Anthropic's streaming output, but are found nowhere in the litellm source code.
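For context, supporting this would mean assembling the tool-call arguments from Anthropic's content_block_start / content_block_delta events, where each input_json_delta carries only a fragment of the arguments in partial_json. A hedged sketch of what that handling could look like (the event dicts below are illustrative, not a captured response):

```python
import json

def assemble_tool_call(events):
    """Collect a tool name and its JSON arguments from Anthropic-style
    streaming events. Each input_json_delta holds only a fragment of the
    arguments; they form valid JSON only once concatenated."""
    name = None
    fragments = []
    for event in events:
        if (event["type"] == "content_block_start"
                and event["content_block"]["type"] == "tool_use"):
            name = event["content_block"]["name"]
        elif (event["type"] == "content_block_delta"
                and event["delta"]["type"] == "input_json_delta"):
            fragments.append(event["delta"]["partial_json"])
    return name, json.loads("".join(fragments))

# Illustrative event sequence mirroring the test's get_current_weather tool.
events = [
    {"type": "content_block_start",
     "content_block": {"type": "tool_use", "name": "get_current_weather", "input": {}}},
    {"type": "content_block_delta",
     "delta": {"type": "input_json_delta", "partial_json": '{"location": "Bos'}},
    {"type": "content_block_delta",
     "delta": {"type": "input_json_delta", "partial_json": 'ton, MA", "unit": "fahrenheit"}'}},
    {"type": "content_block_stop"},
]
name, args = assemble_tool_call(events)
```

This is only a sketch of the accumulation logic; the real fix would live in litellm's Anthropic streaming handler alongside its existing delta parsing.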
In the meantime, it might be worth updating the documentation to reflect the lack of support, and making the test_acompletion_claude_3_function_call_with_streaming test fail when the response does not stream.
Relevant log output
Request to litellm:
litellm.acompletion(model='claude-3-opus-20240229', messages=[{'role': 'user', 'content': "What's the weather like in Boston today in fahrenheit?"}], tools=[{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}], tool_choice='auto', stream=True)
self.optional_params: {}
ASYNC kwargs[caching]: False; litellm.cache: None; kwargs.get('cache'): None
Final returned optional params: {'stream': True, 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}]}
self.optional_params: {'stream': True, 'tools': [{'type': 'function', 'function': {'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'parameters': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}}]}
POST Request Sent from LiteLLM:
curl -X POST \
https://api.anthropic.com/v1/messages \
-H 'accept: application/json' -H 'anthropic-version: 2023-06-01' -H 'content-type: application/json' -H 'x-api-key: no-way-jose-********************' -H 'anthropic-beta: tools-2024-05-16' \
-d '{'model': 'claude-3-opus-20240229', 'messages': [{'role': 'user', 'content': [{'type': 'text', 'text': "What's the weather like in Boston today in fahrenheit?"}]}], 'tools': [{'name': 'get_current_weather', 'description': 'Get the current weather in a given location', 'input_schema': {'type': 'object', 'properties': {'location': {'type': 'string', 'description': 'The city and state, e.g. San Francisco, CA'}, 'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']}}, 'required': ['location']}}], 'max_tokens': 4096}'
Twitter / LinkedIn details
No response