Responses: reasoning.generate_summary interpreted as unsupported parameter
Confirm this is an issue with the Python library and not an underlying OpenAI API
- [x] This is an issue with the Python library
### Describe the bug

When I pass the `reasoning.generate_summary` argument (as `"detailed"` or `"concise"`) I get a 400 error:

```
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'reasoning.generate_summary' is not supported with the 'o3-mini-2025-01-31' model.", 'type': 'invalid_request_error', 'param': 'reasoning.generate_summary', 'code': 'unsupported_parameter'}}
```

I've tried this with a few reasoning models; the same error arises each time. I can also see `reasoning=Reasoning(effort='low', generate_summary=None)` in the response object, so the parameter is definitely built into the library somewhere!
### To Reproduce

Run a `client.responses.create` request with any reasoning model and `generate_summary` set to `"detailed"` or `"concise"`.
### Code snippets

```python
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="o3-mini",
    input="Can you outline how to determine the capital of France.",
    reasoning={
        "effort": "medium",
        "generate_summary": "detailed",
    },
)
print(response)
```
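For anyone triaging this: the structured body of the 400 error already pinpoints the rejected field, which suggests the server (not the client) is refusing the parameter. A minimal sketch that only parses the payload copied from the traceback above (no live API call, no assumptions beyond that payload):

```python
# Error payload copied verbatim from the traceback above (Python dict repr).
error_body = {
    "error": {
        "message": "Unsupported parameter: 'reasoning.generate_summary' is not "
                   "supported with the 'o3-mini-2025-01-31' model.",
        "type": "invalid_request_error",
        "param": "reasoning.generate_summary",
        "code": "unsupported_parameter",
    }
}

err = error_body["error"]
# The server itself names the rejected parameter and a machine-readable code,
# which is useful when deciding whether to retry without the parameter.
print(err["param"])  # reasoning.generate_summary
print(err["code"])   # unsupported_parameter
```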
### OS

macOS

### Python version

Python 3.12.2

### Library version

openai v1.66.3