[Bug]: Unexpected argument error when using gpt-images-1 model with OpenAI
What happened?
When attempting to use the gpt-images-1 model via the OpenAI provider in LiteLLM, the following error is thrown:
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncImages.generate() got an unexpected keyword argument 'output_format'
Reproduction Steps
- Configure LiteLLM to use the gpt-images-1 model from OpenAI.
- Send a request through the LiteLLM interface (a minimal sketch follows this list).
- Observe the crash with the unexpected keyword argument error.
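A minimal repro sketch in Python, assuming the request goes through the LiteLLM SDK's image_generation call with an image-specific parameter such as output_format forwarded to the OpenAI client; the model name and parameter mirror what is reported in this thread rather than a verified configuration:

```python
import litellm

# Hypothetical repro: forward an image-specific parameter through LiteLLM.
# "gpt-image-1" and "output_format" mirror the errors reported in this thread;
# your deployment (SDK vs. proxy) may route the request differently.
response = litellm.image_generation(
    model="gpt-image-1",
    prompt="a watercolor fox",   # illustrative prompt
    output_format="png",         # extra kwarg that triggers the TypeError
)
print(response)
```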
Expected Behavior
The gpt-images-1 model should be handled gracefully, or LiteLLM should explicitly indicate that it is not supported if that's the case.
Environment
- Provider: OpenAI
- Model: gpt-images-1
Additional Notes
This model might not conform to the same API structure as chat/completion models. It may require special-case handling, or it should be rejected with a clear error if it is not supported.
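One way to illustrate the special-case handling suggested above (purely a sketch, not LiteLLM's actual code) is to filter outgoing kwargs against what the installed OpenAI SDK's images.generate() actually accepts, and drop or reject everything else:

```python
import inspect
from openai import AsyncOpenAI

async def safe_image_generate(client: AsyncOpenAI, **data):
    """Sketch: drop kwargs the installed SDK's images.generate() does not accept."""
    allowed = set(inspect.signature(client.images.generate).parameters)
    unsupported = sorted(k for k in data if k not in allowed)
    if unsupported:
        # Alternatively, raise a clear error here instead of silently dropping.
        print(f"Dropping params not accepted by this SDK version: {unsupported}")
    return await client.images.generate(
        **{k: v for k, v in data.items() if k in allowed}
    )
```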
Let me know if you need help testing a fix or workaround.
Relevant log output
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 4550, in aimage_generation
response = await init_response # type: ignore
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 1271, in aimage_generation
raise e
File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 1254, in aimage_generation
response = await openai_aclient.images.generate(**data, timeout=timeout) # type: ignore
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: AsyncImages.generate() got an unexpected keyword argument 'output_format'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3381, in async_function_with_fallbacks
raise original_exception
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3237, in async_function_with_fallbacks
response = await self.async_function_with_retries(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3615, in async_function_with_retries
raise original_exception
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3506, in async_function_with_retries
response = await self.make_call(original_function, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3624, in make_call
response = await response
^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1884, in _aimage_generation
raise e
File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1871, in _aimage_generation
response = await response
^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1460, in wrapper_async
raise e
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1321, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/main.py", line 4557, in aimage_generation
raise exception_type(
~~~~~~~~~~~~~~^
model=model,
^^^^^^^^^^^^
...<3 lines>...
extra_kwargs=kwargs,
^^^^^^^^^^^^^^^^^^^^
)
^
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2222, in exception_type
raise e # it's already mapped
^^^^^^^
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 469, in exception_type
raise APIConnectionError(
...<7 lines>...
)
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncImages.generate() got an unexpected keyword argument 'output_format'
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
main-latest
Twitter / LinkedIn details
No response
The same problem occurs if you try to specify moderation (supported only by gpt-image-1).
And to be clear, with no extra params, just model and prompt, it works.
Hi @rodrigo-lurnova, are you calling the /chat/completions endpoint here? Can you give me a clear way to repro this?
@ishaan-jaff are other params like background and output_compression supported? When I pass background I get a similar error to the one above, except LiteLLM doesn't crash. Is there a way to drop these params if they are not supported by LiteLLM?
what error do you see @danieldjupvik ?
@ishaan-jaff 500 litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncImages.generate() got an unexpected keyword argument 'background'. Received Model Group=gpt-image-1 Available Model Group Fallbacks=None
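For context on dropping unsupported params: LiteLLM documents a drop_params switch (globally via litellm.drop_params = True, or per request) for stripping provider-unsupported parameters. Whether the image-generation path honors it is part of what this thread calls into question (see the dall-e-3 comment further down, where 'drop_params' itself gets forwarded), so treat the following as something to test rather than a confirmed workaround:

```python
import litellm

# Documented switch for dropping provider-unsupported params; its coverage of
# the image-generation path is unverified here, so this is a test case, not a fix.
litellm.drop_params = True

response = litellm.image_generation(
    model="gpt-image-1",
    prompt="a watercolor fox",
    background="transparent",  # the param reported to raise the TypeError
)
```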
> Hi @rodrigo-lurnova are you calling the /chat/completions endpoint here ? can you give me a clear way to repro this

Hi @ishaan-jaff, thanks for addressing my issue. We are using the /images/generations endpoint.
@rodrigo-lurnova - Is there any model by the name gpt-images-1?
I get the same error message from LibreChat:
Something went wrong when trying to generate the image. The OpenAI API may be unavailable:
Error Message: 500 litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncImages.generate() got an unexpected keyword argument 'background'. Received Model Group=gpt-image-1
Available Model Group Fallbacks=None
> @rodrigo-lurnova - Is there any model by name gpt-images-1

Yes, there is a model by the name gpt-images-1.
This seems similar to a dall-e-3 error I am seeing: [ERROR: litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncImages.generate() got an unexpected keyword argument 'drop_params'. Received Model Group=dall-e-3 Available Model Group Fallbacks=None]
any update on this?
Getting a TypeError: ImageResponse.__init__() got an unexpected keyword argument 'background' error if I try to use gpt-image-1 as the model without adding any extra properties.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.