zerox
OpenAI: Add support for Reasoning models (o1, o3, o4, etc)
Currently, when using the o1, o3, o4, or mini models, OpenAI returns a couple of errors:
- Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
- Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.
Expected behavior: ModelOptions should include the reasoning models, and when one of them is used, the library should pass the appropriate request params to the OpenAI API (max_completion_tokens instead of max_tokens, and no temperature override).
A simple fix would be to just adjust those params here, or to add proper support with separate request properties for OpenAI reasoning models; see the sketch below.
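For illustration, here is a minimal sketch of the branching I have in mind. This is not the actual zerox code: the `isReasoningModel` helper, the prefix list, and `completePage` are assumptions, and it calls the official openai Node SDK directly.

```typescript
import OpenAI from "openai";

// Assumption: treat model names starting with these prefixes as reasoning models.
const REASONING_MODEL_PREFIXES = ["o1", "o3", "o4"];

function isReasoningModel(model: string): boolean {
  return REASONING_MODEL_PREFIXES.some((prefix) => model.startsWith(prefix));
}

// Hypothetical request builder showing how the params could differ per model family.
async function completePage(model: string, prompt: string, maxTokens: number) {
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const messages = [{ role: "user" as const, content: prompt }];

  if (isReasoningModel(model)) {
    // Reasoning models reject `max_tokens` and any non-default `temperature`,
    // so send `max_completion_tokens` and leave temperature alone.
    return client.chat.completions.create({
      model,
      messages,
      max_completion_tokens: maxTokens,
    });
  }

  // Non-reasoning models keep the existing behaviour.
  return client.chat.completions.create({
    model,
    messages,
    max_tokens: maxTokens,
    temperature: 0,
  });
}
```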
If you want to use these models now, you can patch the compiled model.js module located at node_modules/zerox/node-zerox/dist/utils/model.js.
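To keep such a manual edit from being wiped out by the next npm install, a tool like patch-package can capture it: edit the file, run `npx patch-package zerox`, and commit the generated patch. This is just a suggested workflow, not something zerox ships with.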
Have a nice day :)