Misleading documentation or incorrect behaviour of small_model
Description
The OpenCode documentation at https://opencode.ai/docs/config/ states:
The small_model option configures a separate model for lightweight tasks like title generation. By default, OpenCode tries to use a cheaper model if one is available from your provider, otherwise it falls back to your main model.
However, the code at https://github.com/anomalyco/opencode/blob/253b7ea78403585db916dc2746d07f622015c597/packages/opencode/src/provider/provider.ts#L1109 tells a different story: OpenCode first attempts to use its own opencode provider for the small model, and only if the opencode provider is unavailable does it fall back to the user's main configured model.
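In pseudocode, the selection order the linked code appears to implement looks roughly like this (a sketch with illustrative names, not the actual implementation; the assumption that an explicitly configured small_model takes priority is mine):

```typescript
// Sketch of the small-model selection order observed in provider.ts.
// All names here are hypothetical.
function pickSmallModel(opts: {
  configuredSmallModel?: string // user-set small_model, if any
  availableProviders: string[]  // providers currently authenticated
  mainModel: string             // user's main configured model
}): string {
  // 1. An explicitly configured small_model is assumed to win.
  if (opts.configuredSmallModel) return opts.configuredSmallModel
  // 2. Otherwise the built-in "opencode" provider is preferred,
  //    even when the main model comes from another provider.
  if (opts.availableProviders.includes("opencode")) {
    return "opencode/<default-small>" // placeholder for the built-in default
  }
  // 3. Only then does it fall back to the main model,
  //    which is what the documentation describes as the default behaviour.
  return opts.mainModel
}
```

With a DeepSeek-only setup, step 2 never matching would give the documented behaviour; the issue reported here is that step 2 matched instead.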
This discrepancy can lead to unexpected behavior. For example, with deepseek-reasoner (via the DeepSeek provider) configured as the main model, OpenCode silently used opencode nano-5-gpt as the small model. As a result, session data was sent to OpenCode's servers without explicit notice, which raises privacy concerns.
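Until the documentation and code are reconciled, one workaround is to pin small_model explicitly so the opencode provider is never chosen implicitly. Assuming the standard opencode.json config file described in the docs (the model IDs below are illustrative):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "deepseek/deepseek-reasoner",
  "small_model": "deepseek/deepseek-chat"
}
```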
Plugins
No response
OpenCode version
No response
Steps to reproduce
No response
Screenshot and/or share link
No response
Operating System
No response
Terminal
No response