[Feature]: Add Support for Google Gemini API as an LLM Option
Which desktop app does this feature request relate to?
UI-TARS Desktop
What problem does this feature solve?
Support Gemini API
What does the proposed feature look like?
Select Gemini in the AI Model Provider dropdown.
BTW, maybe put Settings here.
Thank you for your feedback! Just to clarify: we already have GeminiProvider implemented (apps/agent-tars/src/main/llmProvider/providers/GeminiProvider.ts), but it is currently not exposed in the user interface. Because of that, the remaining effort to complete this feature should be relatively low.
If you're interested, contributions to improve or enhance this feature are always welcome! Feel free to share your thoughts or submit a pull request.
Ref: Implement DeepSeek model provider: https://github.com/bytedance/UI-TARS-desktop/pull/350
Note that even once the Gemini provider is supported, running stability is not guaranteed; see: https://agent-tars.com/doc/quick-start#compare-model-providers
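For anyone who wants to pick this up, here is a minimal sketch of what exposing the existing GeminiProvider in the settings UI might look like. The names below (ProviderOption, PROVIDER_OPTIONS, the default model strings) are purely illustrative assumptions and do not correspond to actual identifiers in this repo; the DeepSeek PR linked above is the authoritative reference for which files actually need to change.

```ts
// Hypothetical sketch only: the types and constants here are assumptions,
// not the real identifiers used in apps/agent-tars. The idea is simply to
// add a Gemini entry to whatever option list the "AI Model Provider"
// dropdown renders, so the already-implemented GeminiProvider becomes
// selectable from the UI.

interface ProviderOption {
  id: string;           // value handed to the LLM provider factory
  label: string;        // text shown in the provider dropdown
  defaultModel: string; // model preselected when the provider is chosen
}

export const PROVIDER_OPTIONS: ProviderOption[] = [
  { id: 'openai', label: 'OpenAI', defaultModel: 'gpt-4o' },
  { id: 'deepseek', label: 'DeepSeek', defaultModel: 'deepseek-chat' },
  // New entry mapping to the existing GeminiProvider implementation:
  { id: 'gemini', label: 'Google Gemini', defaultModel: 'gemini-1.5-pro' },
];
```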
I've made the changes. Happy to submit a PR.
Hi, is there support for the Gemini API in UI-TARS (not Agent TARS)?
I'm planning to test out computer use with the Gemini API (2.5 Flash Preview) through UI-TARS on my desktop, and was wondering if that's already possible.
Thank you for the awesome project