feat(openai): promptCacheRetention settings
https://platform.openai.com/docs/guides/prompt-caching#extended-prompt-cache-retention
Wanted a way to enable the 24h extended prompt cache retention when using OpenAI through an API key.
Not sure whether we should warn the user if they are using a model that doesn't support it. But it is a power-user feature, so they hopefully know what they are doing.
According to the docs, it is only supported on the following models: gpt-5.1, gpt-5.1-codex, gpt-5.1-codex-mini, gpt-5.1-chat-latest, gpt-5, gpt-5-codex, gpt-4.1
But that list is already outdated (e.g. gpt-5.1-codex-max), so I've left it up to the user rather than adding warnings.
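For context, a minimal sketch of how a user might enable this, assuming the new setting lives under the OpenAI provider options in opencode.json (the field name comes from this PR; the exact placement and value format are assumptions):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "options": {
        "promptCacheRetention": "24h"
      }
    }
  }
}
```

The intent is for this value to be forwarded on requests as OpenAI's `prompt_cache_retention` parameter described in the linked docs.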
Verified via mitmproxy: