'ValidationException when calling the InvokeModelWithResponseStream operation' in Open-WebUI 0.1.125 using Claude 3 on AWS Bedrock
Bug Report
Description
Bug Summary:
When using AWS Bedrock with the Claude 3 Sonnet model, every prompt now raises a validationException error in Open-WebUI 0.1.125.
Note that this works fine in Open-WebUI 0.1.123; with the 0.1.125 builds, however, any prompt attempt results in this error:
```
An error occurred (validationException) when calling the InvokeModelWithResponseStream operation: messages: text content blocks must be non-empty
```

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 3077, in async_data_generator
    async for chunk in response:
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 10447, in __anext__
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 10379, in __anext__
    chunk = next(self.completion_stream)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/botocore/eventstream.py", line 603, in __iter__
    parsed_event = self._parse_event(event)
                   ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/botocore/eventstream.py", line 619, in _parse_event
    raise EventStreamError(parsed_response, self._operation_name)
botocore.exceptions.EventStreamError: An error occurred (validationException) when calling the InvokeModelWithResponseStream operation: messages: text content blocks must be non-empty
```
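For reference, the failure can likely be reproduced directly against Bedrock with boto3, bypassing Open-WebUI and LiteLLM entirely. A minimal sketch, assuming AWS credentials with Bedrock access are configured; the empty text block is a hypothesis about the trigger, based on the error message above:

```python
# Hypothetical reproduction: send a message containing an empty text
# block, which the error message suggests Bedrock rejects.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": ""}]},
    ],
}

response = client.invoke_model_with_response_stream(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)

# In the traceback above, the validationException surfaced while
# iterating the event stream, not at request time.
for event in response["body"]:
    print(event)
```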
The same error is raised both in chats and in the playground.
The 3-5 word chat title does get generated, however.
Environment
- Open WebUI Version: 0.1.125
- Operating System: Official Docker image from 0.1.125 release.
LiteLLM is configured with the `bedrock/anthropic.claude-3-sonnet-20240229-v1:0` model and Langfuse callbacks.
`litellm/config.yaml`:

```yaml
model_list:
  - model_name: Claude 3 Sonnet
    litellm_params:
      model: bedrock/anthropic.claude-3-sonnet-20240229-v1:0
      max_tokens: 200000

litellm_settings:
  success_callback: ["langfuse"]
  failure_callback: ["langfuse"]
```
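To take Open-WebUI out of the loop, the proxy route can also be exercised directly with an OpenAI-compatible request. A sketch, where the URL, port, and API key are placeholders for whatever the proxy is actually configured with:

```python
# Sanity-check the LiteLLM route without Open-WebUI in the loop.
# URL and key below are placeholders, not values from this setup.
import requests

resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-placeholder"},
    json={
        "model": "Claude 3 Sonnet",  # matches model_name in config.yaml
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(resp.status_code, resp.json())
```

If this succeeds while chats from Open-WebUI fail, the offending payload is being built on the Open-WebUI side.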
Reproduction Details
Confirmation:
- [x] I have read and followed all the instructions provided in the README.md.
- [x] I am on the latest version of both Open WebUI and Ollama.
- [x] I have included the browser console logs.
- [x] I have included the Docker container logs.
Logs and Screenshots
Browser Console Logs:
The error seems to be returned as part of the chat history.
Docker Container Logs:
The FastAPI logs don't show much; requests respond with 200.
```
INFO: 172.30.254.67:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 172.30.254.67:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 172.30.254.67:0 - "POST /api/v1/chats/96b6ca9c-6861-4fed-be3d-fa86b77f0ade HTTP/1.1" 200 OK
INFO: 172.30.254.67:0 - "POST /litellm/api/v1/chat/completions HTTP/1.1" 200 OK
```
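The 200s are consistent with the error surfacing mid-stream: by the time Bedrock rejects the request, the proxy has already sent the response headers, so the failure arrives as a chunk inside the SSE body rather than as an HTTP error status. One way to observe this, using the same placeholder URL as above:

```python
# Stream a completion and print raw SSE lines; the error chunk shows
# up in the body even though the HTTP status is 200.
import requests

resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-placeholder"},
    json={
        "model": "Claude 3 Sonnet",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": True,
    },
    stream=True,
)
print(resp.status_code)  # already 200 before the error occurs
for line in resp.iter_lines():
    if line:
        print(line.decode())
```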
Screenshots (if applicable):
Installation Method
Docker image.
Can you show us which Docker image you're running?

```
docker image ls | grep open-webui
```
@justinh-rahb The errors occur when using the `ghcr.io/open-webui/open-webui:0.1.125` image. The older `ghcr.io/open-webui/open-webui:0.1.123` image works.
@kalaspuff There were a few fixes subsequent to the release of the 0.1.125 tag; you'll need to pull `ghcr.io/open-webui/open-webui:main` to ensure you're on the actual latest version of 0.1.125.
This is why I asked for the Docker image, so I could compare the SHA256 hash, but now I know you're running an older one.
Same issue on main (hash digest 9ce6b38670a239fc575e75f1070b74e162afa318836a3af18d72d47d1a8f4f73). I'll try with an updated LiteLLM later on, as https://github.com/BerriAI/litellm/pull/2780 seems related.
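If the empty-text-block theory holds, the fix would amount to filtering such blocks out of the messages payload before it reaches Bedrock. A sketch of that kind of filtering (not actual code from any PR):

```python
# Sketch of the suspected fix: drop empty text blocks (and messages
# left without content) before the payload reaches Bedrock.
def strip_empty_text_blocks(messages: list[dict]) -> list[dict]:
    cleaned = []
    for message in messages:
        content = message.get("content")
        if isinstance(content, list):
            content = [
                block
                for block in content
                if not (block.get("type") == "text" and not block.get("text"))
            ]
            if not content:
                continue  # drop messages left with no content at all
            message = {**message, "content": content}
        elif isinstance(content, str) and not content.strip():
            continue  # drop messages whose content is an empty string
        cleaned.append(message)
    return cleaned
```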
Due diligence out of the way, it seems we'll need to dig deeper into this.