gateway
[Feature] Remove the temperature param for reasoning models like o1 and o3
What Would You Like to See with the Gateway?
Some libraries, like the Vercel AI SDK, send a default `temperature: 0` value, which breaks calls to OpenAI's reasoning models. Currently there is no way to remove it. Is it possible to ignore or strip that param in the gateway?
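In the meantime, a client-side workaround is possible. The sketch below (not a gateway feature, just a hypothetical helper) strips `temperature` from the serialized request body before it leaves the client, e.g. via the custom `fetch` option that SDKs like the Vercel AI SDK accept:

```typescript
// Hypothetical workaround sketch: remove `temperature` from a serialized
// chat-completion request body so reasoning models never receive it.
export function stripTemperature(body: string): string {
  const parsed = JSON.parse(body) as Record<string, unknown>;
  delete parsed.temperature; // drop the offending param if present
  return JSON.stringify(parsed);
}

// Wrap fetch so every outgoing JSON body is sanitized before sending.
export const fetchWithoutTemperature: typeof fetch = (input, init) => {
  if (init?.body && typeof init.body === "string") {
    init = { ...init, body: stripTemperature(init.body) };
  }
  return fetch(input, init);
};
```

This wrapper could then be passed as the `fetch` option when constructing the SDK's provider, leaving all other request fields untouched.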
Context for your Request
No response
Your Twitter/LinkedIn
No response
hey @carlosveloso — the gateway does not transform requests sent to OpenAI, since the gateway is meant to be OpenAI-compliant. We could remove the param for specific models, but I don't think that's a great idea.
we'll support deleting keys using the `x-portkey-config` header
closing as stale + not planned