zed
Implement Groq or OS models for Assistant Tab (settings.assistant)
Check for existing issues
- [X] Completed
Describe the feature
Currently, the Assistant Tab is limited to the OpenAI API, which can be a hurdle for open source projects. To provide more flexibility, I suggest adding a configurable field to change the `base_url` for the Assistant Tab. This would let users access free AI services like Groq, or use Ollama, LiteLLM, or other models by specifying their `base_url`. I haven't fully vetted or tested these alternatives yet, so some additional checks may be needed before implementing this.
If applicable, add mockups / screenshots to help present your vision of the feature
It would be great to have a window or dialog similar to the request panel, where users can:
- Enter the base_url for alternative AI models/services
- Provide names and settings for custom models
- Save and manage multiple configurations
The dialog could have a dropdown to select pre-defined options (OpenAI, Groq, Ollama, etc.) as well as the ability to add new custom entries with a friendly name, base_url, and required parameters.
This would allow easy switching between different AI models/providers without manually entering URLs and settings every time, providing a seamless experience for developers and users exploring various AI capabilities.
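As a rough illustration of what such saved configurations could look like in `settings.json` (a sketch only: the `providers`, `display_name`, and `active_provider` keys are hypothetical and not an existing Zed setting; the Groq and Ollama OpenAI-compatible endpoints shown are their documented defaults, but model names are examples):

```json
{
  "assistant": {
    "providers": [
      {
        "display_name": "Groq (free tier)",
        "base_url": "https://api.groq.com/openai/v1",
        "default_model": "llama3-70b-8192"
      },
      {
        "display_name": "Local Ollama",
        "base_url": "http://localhost:11434/v1",
        "default_model": "llama3"
      }
    ],
    "active_provider": "Groq (free tier)"
  }
}
```

The dropdown in the dialog would then just pick among the `display_name` entries, so switching providers never requires re-typing a URL.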
Need this!
Yeah, love this thing in ST
Hope to support custom models, such as:

```json
"assistant": {
  "version": "1",
  "provider": {
    "name": "openai",
    "type": "openai",
    "default_model": "gpt-custom-model-1",
    "custom_model": ["gpt-custom-model-1", "gpt-custom-model-2"],
    "api_url": "https://www.example.com"
  }
},
```
When?
I'd like to take this on!
As of #12902, Ollama is now supported as a way to interface with local language models.
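For anyone wanting to try that, a minimal configuration would presumably follow the provider schema shown earlier in this thread, along these lines (a sketch only; the exact key names may differ from the merged implementation, so check the Zed settings documentation):

```json
"assistant": {
  "version": "1",
  "provider": {
    "name": "ollama",
    "default_model": "llama3"
  }
},
```

This assumes an Ollama instance is already running locally with the named model pulled.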