OpenAI model listing includes incompatible models
The OpenAI model listing shows all available models, but goose only supports v1/chat/completions.
To Reproduce:
- I verified the exact name to use in the model listing using goose configure
- Start a session:

```
$ GOOSE_MODEL=gpt-5-codex goose session
starting session | provider: openai model: gpt-5-codex ...
Goose is running! Enter your instructions, or try asking what goose can do.
Context: ○○○○○○○○○○ 0% (0/400000 tokens)
( O)>
```
Expected behavior
Only the supported models should be shown in the listing, so the listing needs to be aware of which API each model uses.
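For illustration, filtering along these lines is roughly what I'd expect the listing to do. This is a minimal sketch only; the struct names and the "Responses-only" list are assumptions, not goose's actual code:

```rust
// Cargo deps assumed: serde = { version = "1", features = ["derive"] }, serde_json = "1"
use serde::Deserialize;

/// One entry from OpenAI's GET /v1/models response.
#[derive(Deserialize)]
struct Model {
    id: String,
}

#[derive(Deserialize)]
struct ModelList {
    data: Vec<Model>,
}

/// Assumed deny-list of model families that only speak the Responses API,
/// not v1/chat/completions (illustrative, not exhaustive).
const RESPONSES_ONLY_PREFIXES: &[&str] = &["gpt-5-codex"];

/// Keep only the models goose can actually drive over v1/chat/completions.
fn chat_completion_models(list: ModelList) -> Vec<String> {
    list.data
        .into_iter()
        .map(|m| m.id)
        .filter(|id| !RESPONSES_ONLY_PREFIXES.iter().any(|p| id.starts_with(*p)))
        .collect()
}

fn main() {
    // Stand-in for the JSON a real GET https://api.openai.com/v1/models returns.
    let raw = r#"{"data":[{"id":"gpt-4o"},{"id":"gpt-5-codex"}]}"#;
    let list: ModelList = serde_json::from_str(raw).expect("valid model list JSON");
    println!("{:?}", chat_completion_models(list)); // ["gpt-4o"]
}
```

A deny-list is only one way to do it; checking each model's supported endpoint directly, if that metadata ever becomes available from the API, would obviously be better.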
Please provide the following information:
- OS & Arch: [e.g. Ubuntu 22.04 x86]
- Interface: [CLI]
- Version: [e.g. v1.9.3]
- Extensions enabled: [e.g. Computer Controller, Figma]
- Provider & Model: [OpenAI gpt-5-codex]
Additional context
I want to use goose more for coding, but at this point it is not possible this way, so I have to revert to codex cli. In general I think goose developer is faster at pinpointing specific files, so I wanted to see how token usage compares between goose cli and codex cli.
https://github.com/block/goose/issues/5270 will resolve this.
Looking into it
I"d suggest @katzdave to be working on this and that issue
Sounds good @DOsinga
@katzdave I don't believe they have a machine-readable public spec for the Responses API, but I did get some advice from an OpenAI engineer on this:
Try the context7 MCP and point it at https://context7.com/openai/completions-responses-migration-pack
Just passing it along in case it helps get started on the conversion
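In case it helps frame the scope of the conversion, here's a rough sketch of how the two request bodies differ (field names are from OpenAI's public docs; the surrounding code is illustrative, not goose's):

```rust
// Cargo dep assumed: serde_json = "1"
use serde_json::json;

fn main() {
    // What goose sends today: a Chat Completions request,
    // POSTed to https://api.openai.com/v1/chat/completions
    let chat_completions_body = json!({
        "model": "gpt-4o",
        "messages": [
            { "role": "user", "content": "Summarize this repo" }
        ]
    });

    // Rough shape of the equivalent Responses API request,
    // POSTed to https://api.openai.com/v1/responses
    // (plain-string `input` shown; the migration pack above covers the
    // structured-message form and the response/streaming shapes).
    let responses_body = json!({
        "model": "gpt-5-codex",
        "input": "Summarize this repo"
    });

    println!("{chat_completions_body}\n{responses_body}");
}
```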