Support "think" and "no think" params on Ollama
Thanks for the great work on the plugin!
Ollama now supports "think" and "no think" settings for reasoning models like DeepSeek, Qwen, and others:
https://ollama.com/blog/thinking
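For reference, per the linked blog post, thinking is toggled with a boolean `think` field on Ollama's `/api/chat` (and `/api/generate`) request body. A minimal sketch of what the plugin would need to send (the model name and helper function here are just illustrative):

```python
import json

def build_chat_request(model: str, prompt: str, think: bool) -> str:
    """Build a JSON body for Ollama's POST /api/chat with thinking toggled."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Per the Ollama blog post: think=False suppresses the reasoning trace
        "think": think,
        "stream": False,
    }
    return json.dumps(body)

# Example: disable thinking for a quick context-menu command
payload = build_chat_request("deepseek-r1:7b", "Summarize this note.", think=False)
```

Posting this body to the local Ollama server (default `http://localhost:11434/api/chat`) should return just the answer, without the reasoning output.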
It would be great if this could be supported by Copilot!
R1:7b is one of the best local models for mainstream laptops, but it overthinks terribly in thinking mode, while it is surprisingly snappy and powerful with thinking turned off.
I see. We are going to ship an offline mode for the plugin where everything agentic can run offline, and Ollama is going to be the center of that mode. Stay tuned!
This would be incredibly helpful to have. For simple commands in my context menu, I do not want the thinking output wasting time and text.