
Model for inline edit

Open sr-tream opened this issue 8 months ago • 1 comment

Please describe the feature you want

Split the chat and inline edit models.

Motivation:

  1. Inline edit does not support thinking models.
  2. Inline edits can be done with smaller models, which work faster.
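
To make the request concrete, here is a purely hypothetical sketch of what a split configuration could look like. Tabby has no [model.inline_edit.http] section today; the section name and layout are assumptions, shown only to illustrate pointing chat and inline edit at different models.

[model.chat.http]
kind = "openai/chat"
model_name = "qwen3"                        # reasoning model for chat
api_endpoint = "http://127.0.0.1:11434/v1"

# Hypothetical section, not supported by Tabby today
[model.inline_edit.http]
kind = "openai/chat"
model_name = "qwen2.5-coder:3b"             # small, non-thinking model for inline edit
api_endpoint = "http://127.0.0.1:11434/v1"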

Additional context

Screenshot: inline edit with the qwen3 model

The screenshot demonstrates the current behavior of inline edit when a reasoning model is used. Reasoning models are very good for chat, but bad for inline editing.

Workaround for current version

Add a non-thinking model to supported_models and switch to it before running inline edit. Example:

[model.chat.http]
kind = "openai/chat"
model_name = "qwen2.5-coder:3b"
api_endpoint = "http://127.0.0.1:11434/v1"
supported_models = ["qwen3", "rhundt/qwen3-64k:30b", "qwen2.5-coder:3b"]

NOTE: model_name is used for inline edit here, but by default the first model from supported_models is used.


Please reply with a 👍 if you want this feature.

sr-tream avatar May 04 '25 19:05 sr-tream

Certainly, thank you for the report. The issue with reasoning models is already on our radar. We will investigate it further in due course. In the meantime, please feel free to utilize the workaround as you've outlined.

zwpaper avatar May 06 '25 07:05 zwpaper