feat: add vision chat for ollama_chat
## Title
Implement vision chat with the `ollama_chat` provider
## Relevant issues
Fixes #6808
## Type
🆕 New Feature
## Changes
- Formatted the request to include image content for the `ollama_chat` provider.
## [REQUIRED] Testing - Attach a screenshot of any new tests passing locally
If UI changes, send a screenshot/GIF of working UI fixes
Before the fix:
```python
completion(
    model="ollama_chat/llava:7b",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What's in this image?"},
                {"type": "image_url", "image_url": {"url": jpeg_image}},
            ],
        }
    ],
    api_base="http://localhost:11434",
)
```
```
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Traceback (most recent call last):
  File "$HOME\litellm\litellm\main.py", line 2798, in completion
    generator = ollama_chat.get_ollama_response(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "$HOME\litellm\litellm\llms\ollama_chat.py", line 325, in get_ollama_response
    raise OllamaError(status_code=response.status_code, message=response.text)
litellm.llms.ollama_chat.OllamaError: {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "$HOME\litellm\main.py", line 18, in <module>
    response = completion(
               ^^^^^^^^^^^
  File "$HOME\litellm\litellm\utils.py", line 960, in wrapper
    raise e
  File "$HOME\litellm\litellm\utils.py", line 849, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "$HOME\litellm\litellm\main.py", line 3059, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "$HOME\litellm\litellm\litellm_core_utils\exception_mapping_utils.py", line 2136, in exception_type
    raise e
  File "$HOME\litellm\litellm\litellm_core_utils\exception_mapping_utils.py", line 2105, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Ollama_chatException - {"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}
```
After the fix, the same call completes successfully.
## Description by Korbit AI

### What change is being made?
Add functionality for handling and converting images within the `ollama_chat` module by introducing the `_convert_image` function, and test this functionality with new unit tests.

### Why are these changes being made?
This change enhances the `ollama_chat` module by allowing it to process inline images from user messages, converting non-JPEG/PNG formats to JPEG. This ensures better compatibility with Ollama's image handling requirements. The included unit tests confirm that the image conversion works as expected, which is essential for maintaining robustness.
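To illustrate the format check such a `_convert_image` helper would need to make first (a minimal sketch with hypothetical helper names, not the PR's actual implementation; the actual re-encoding of unsupported formats to JPEG, which would use a library like Pillow, is omitted):

```python
import base64


def _strip_data_url(image: str) -> str:
    # Hypothetical helper: drop a "data:image/...;base64," prefix if present.
    if image.startswith("data:") and "," in image:
        return image.split(",", 1)[1]
    return image


def needs_jpeg_conversion(image_b64: str) -> bool:
    """Return True if the base64-encoded image is neither JPEG nor PNG,
    i.e. it would need re-encoding before being sent to Ollama.
    Format is detected from the decoded magic bytes."""
    raw = base64.b64decode(_strip_data_url(image_b64))
    is_jpeg = raw.startswith(b"\xff\xd8\xff")
    is_png = raw.startswith(b"\x89PNG\r\n\x1a\n")
    return not (is_jpeg or is_png)
```

Images already in JPEG or PNG form can then be passed through untouched, so only unsupported formats pay the conversion cost.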
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.