[Feature] Add Support for OpenAI o3 and o4-mini models
Problem Description
ChatBox currently lacks support for OpenAI's latest o3 and o4-mini models.

Proposed Solution
Integrate the new model variants into ChatBox.

Additional Context
Technical documentation references:
It's already supported in the Chatbox AI provider now. For the Custom OpenAI provider, is there any feature missing for you?
It seems to be impossible to talk to the new models because the top_p parameter is always sent with the request:
{
"error": {
"message": "Unsupported parameter: 'top_p' is not supported with this model.",
"type": "invalid_request_error",
"param": "top_p",
"code": "unsupported_parameter"
}
}
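For reference, here is a minimal sketch of how a client can avoid this error by only attaching sampling parameters to models that accept them. The prefix list and helper function are assumptions for illustration, not ChatBox's actual code; the request itself uses the official openai Node SDK.

import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Assumption: o1 / o3 / o4 reasoning models reject top_p and any
// non-default temperature, so detect them by model-name prefix.
const REASONING_MODEL_PREFIXES = ["o1", "o3", "o4"];

function isReasoningModel(model: string): boolean {
  return REASONING_MODEL_PREFIXES.some((prefix) => model.startsWith(prefix));
}

async function ask(model: string, prompt: string): Promise<string | null> {
  // Only include sampling parameters for models that support them.
  const sampling = isReasoningModel(model)
    ? {}
    : { top_p: 0.95, temperature: 0.7 };

  const completion = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
    ...sampling,
  });
  return completion.choices[0].message.content;
}

With a guard like this, the same code path can serve both gpt-4o-style models and the o3 / o4-mini family without tripping the unsupported_parameter error above.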
For now, do not use a custom OpenAI-compatible provider for these models; use the built-in OpenAI provider instead, and it should work.
I don't see o4-mini in the list after entering my API key, even though I'm eligible.
(Others coming to this thread may want to check whether these models are available in the OpenAI Playground; o3, for example, requires a higher-tier account.)
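One way to check eligibility without involving any client UI is to list the models your key can actually see. A small sketch, assuming Node 18+ (built-in fetch) and an OPENAI_API_KEY environment variable; the GET /v1/models endpoint is the standard OpenAI model-listing API.

// Sketch: list the models visible to your API key and filter for o3 / o4.
async function listReasoningModels(): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/models", {
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
  });
  const body = (await res.json()) as { data: { id: string }[] };
  const ids = body.data.map((m) => m.id);
  console.log(ids.filter((id) => id.startsWith("o3") || id.startsWith("o4")));
}

listReasoningModels();

If o3 or o4-mini is missing from that output, the account tier is the likely cause rather than ChatBox.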
I have exactly the same issue: o4-mini does not appear. And if I force it as a custom model, it gives this error:
{ "error": { "message": "Unsupported value: 'temperature' does not support 0.45 with this model. Only the default (1) value is supported.", "type": "invalid_request_error", "param": "temperature", "code": "unsupported_value" } }
Same behavior here. o4-mini does not appear. If I add it as a custom model, I get:
{
"error": {
"message": "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.",
"type": "invalid_request_error",
"param": "temperature",
"code": "unsupported_value"
}
}
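Below is a sketch of the kind of guard a custom OpenAI-compatible provider could apply before sending a request: strip the sampling parameters these models reject and leave everything else intact. The type and helper names here are hypothetical; only the parameter names mirror the Chat Completions API.

type ChatParams = {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  temperature?: number;
  top_p?: number;
  [key: string]: unknown;
};

// Hypothetical helper: drop temperature and top_p for o1 / o3 / o4 models,
// since they only accept the default temperature (1) and no top_p at all.
function sanitizeForReasoningModels(params: ChatParams): ChatParams {
  if (!/^o[134]/.test(params.model)) return params;
  const { temperature, top_p, ...rest } = params;
  return rest;
}

// Example: the stripped request no longer triggers the 400 shown above.
const safeParams = sanitizeForReasoningModels({
  model: "o4-mini",
  messages: [{ role: "user", content: "Hello" }],
  temperature: 0.7,
  top_p: 0.9,
});

Dropping the parameters (rather than forcing temperature to 1) keeps the request minimal and matches the API's own guidance in the error message: only the default value is supported.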