[Bug]: Issue switching model for an assistant
Description
When we switch the model for an assistant from a non-reasoning model (gpt-4o-mini) to a reasoning model (o1), the API throws an exception: Unsupported parameter: 'temperature', even though temperature is not part of the array we pass to client->assistants()->modify(). Is there a workaround that does not require deleting the assistant?
Steps To Reproduce
Switch the model of an existing assistant between a non-reasoning model (gpt-4o-mini) and a reasoning model (o1).
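As a rough illustration of the report above (a hypothetical sketch using openai-php/client; the assistant ID and the way the client is constructed are placeholders, not taken from the reporter's code), the failing call would look something like:

```php
<?php

require 'vendor/autoload.php';

$client = OpenAI::client(getenv('OPENAI_API_KEY'));

// The assistant was originally created with a non-reasoning model
// (gpt-4o-mini). Per the report, switching it to a reasoning model (o1)
// with only 'model' in the payload raises:
//   Unsupported parameter: 'temperature'
// even though 'temperature' is not present in this array.
$result = $client->assistants()->modify('asst_xxx', [
    'model' => 'o1',
]);
```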
OpenAI PHP Client Version
v0.10.3
PHP Version
8.1.31
Notes
No response
I couldn't replicate this. I started with an assistant and changed the model back and forth. I quickly learned I had to send a valid reasoning_effort: null for a non-reasoning model, non-null for a reasoning model. Once I did that, switching back and forth worked.
$result = $client->assistants()->modify("asst_xxx", [
    'model' => 'gpt-4o-mini',   // non-reasoning model
    'reasoning_effort' => null, // must be null for non-reasoning models
]);

// changing back to a reasoning model
$result = $client->assistants()->modify("asst_6xxx", [
    'model' => 'o1',
    'reasoning_effort' => 'low', // must be non-null for reasoning models
]);
Are you able to give me a small sample that still replicates the issue?
Closing as not reproducible.