[Bug]: Invalid Options "Images" Error When Accessing Ollama/Llava Model via liteLLM Proxy
What happened?
When using the liteLLM proxy to access the Ollama/Llava model, the request fails with `{"error":"invalid options: images"}`.
How to replicate: the vision-model sample from https://docs.litellm.ai/docs/providers/ollama#ollama-vision-models fails:
```python
import litellm

response = litellm.completion(
    model="ollama/llava",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Whats in this image?"
                },
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNk+A8AAQUBAScY42YAAAAASUVORK5CYII="
                    }
                }
            ]
        }
    ],
)
print(response)
```
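The snippet above calls the litellm SDK directly; the same failure occurs when the request goes through the proxy (the traceback below passes through `proxy_server.py`). A minimal sketch of the proxy-side reproduction, assuming the proxy is serving `ollama/llava` locally on port 8000 (adjust `base_url`, `api_key`, and the model alias to your config):

```python
# Hypothetical proxy-side reproduction; base_url and api_key are assumptions.
import openai

client = openai.OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="sk-anything")

response = client.chat.completions.create(
    model="ollama/llava",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Whats in this image?"},
                {
                    "type": "image_url",
                    "image_url": {
                        # Same 1x1 PNG base64 payload as in the SDK example above.
                        "url": "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNk+A8AAQUBAScY42YAAAAASUVORK5CYII="
                    },
                },
            ],
        }
    ],
)
print(response)
```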
```
HTTP/1.1 500 Internal Server Error
date: Sat, 10 Feb 2024 21:11:56 GMT
server: uvicorn
content-length: 99
content-type: application/json
Connection: close

{
  "error": {
    "message": "{\"error\":\"invalid options: images\"}",
    "type": null,
    "param": null,
    "code": 500
  }
}
```
Environment:
- litellm 1.23.8
- ollama 0.1.6
- Python 3.11.7
Relevant log output
```
Traceback (most recent call last):
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/llms/ollama.py", line 293, in ollama_acompletion
raise OllamaError(status_code=resp.status, message=text)
litellm.llms.ollama.OllamaError: {"error":"invalid options: images"}
Logging Details: logger_fn - None | callable(logger_fn) - False
Logging Details LiteLLM-Failure Call
get cache: cache key: 21-11:cooldown_models; local_only: False
in_memory_result: None
get cache: cache result: None
set cache: key: 21-11:cooldown_models; value: ['cb2b97c2-c0fd-4650-91bf-9b1155448c55']
Custom Logger - final response object: None
get cache: cache key: 21-11:cooldown_models; local_only: False
in_memory_result: None
get cache: cache result: None
set cache: key: 21-11:cooldown_models; value: ['cb2b97c2-c0fd-4650-91bf-9b1155448c55']
Custom Logger - final response object: None
Inside Max Parallel Request Failure Hook
Traceback (most recent call last):
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/main.py", line 275, in acompletion
response = await init_response
^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/llms/ollama.py", line 340, in ollama_acompletion
raise e
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/llms/ollama.py", line 293, in ollama_acompletion
raise OllamaError(status_code=resp.status, message=text)
litellm.llms.ollama.OllamaError: {"error":"invalid options: images"}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 2338, in chat_completion
response = await llm_router.acompletion(**data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 368, in acompletion
raise e
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 364, in acompletion
response = await self.async_function_with_fallbacks(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 950, in async_function_with_fallbacks
raise original_exception
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 873, in async_function_with_fallbacks
response = await self.async_function_with_retries(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 1060, in async_function_with_retries
raise original_exception
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 967, in async_function_with_retries
response = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 444, in _acompletion
raise e
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/router.py", line 423, in _acompletion
response = await litellm.acompletion(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/utils.py", line 2901, in wrapper_async
raise e
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/utils.py", line 2744, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/main.py", line 288, in acompletion
raise exception_type(
^^^^^^^^^^^^^^^
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/utils.py", line 7513, in exception_type
raise e
File "/Users/*****/miniconda3/envs/crewai/lib/python3.11/site-packages/litellm/utils.py", line 7481, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: {"error":"invalid options: images"}
INFO: 127.0.0.1:51456 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
```
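To isolate the failure, Ollama can be queried directly, bypassing litellm. Ollama's generate API takes base64-encoded images in a top-level `images` list and rejects unknown keys inside `options`, so the `invalid options: images` error suggests the payload reaching Ollama carries `images` inside the `options` object. A minimal direct check, assuming Ollama is serving on its default port 11434:

```python
# Hypothetical direct call to the Ollama REST API (no litellm involved).
import requests

PNG_B64 = "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNk+A8AAQUBAScY42YAAAAASUVORK5CYII="

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llava",
        "prompt": "Whats in this image?",
        # Images belong at the top level of the payload, not inside "options".
        "images": [PNG_B64],
        "stream": False,
    },
)
print(resp.status_code, resp.text)
```

If this direct call succeeds, the problem is in how the request body is built on the litellm side; if it fails with the same error, the Ollama server itself may be too old to accept images, since multimodal support landed in Ollama releases later than 0.1.6.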
This has hit me too, as my code relies on the vision model. Thanks for raising.