Local-only support
First of all, amazing work and congratulations on winning the Google Chrome Built-in AI Challenge 🎉
I'd love to use this extension, but I'm not really comfortable with using hosted LLMs.
It would be amazing if the extension had a layer of abstraction over the LLM and embedding model interactions, making it possible to use local models. Supporting OpenAI-compatible API calls would make it possible to use Ollama, LM Studio, and most other popular local inference tools. It would of course also let people use their own OpenAI keys, so at a basic level this would just need a customisable base URL.
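To illustrate the idea (this is a rough sketch, not code from the extension; the `LlmConfig` and `chatCompletion` names are made up): because Ollama, LM Studio, and OpenAI itself all expose the same `/chat/completions` route, one function with a configurable base URL covers all of them.

```ts
// Sketch only: names are illustrative, not part of the extension.
// Swapping baseUrl is enough to target any OpenAI-compatible server.
interface LlmConfig {
  baseUrl: string; // e.g. "http://localhost:11434/v1" (Ollama), "http://localhost:1234/v1" (LM Studio)
  apiKey?: string; // optional for local servers, required for OpenAI
  model: string;   // e.g. "llama3.1" locally, "gpt-4o-mini" on OpenAI
}

async function chatCompletion(cfg: LlmConfig, prompt: string): Promise<string> {
  const res = await fetch(`${cfg.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Only send an Authorization header when a key is configured.
      ...(cfg.apiKey ? { Authorization: `Bearer ${cfg.apiKey}` } : {}),
    },
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```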
I'd also be interested in this. At the very least, it should be quite straightforward to add a UI for adding a custom (Gemini) key.
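For the settings side, something like the following could persist a user-supplied key (a minimal sketch using the standard `chrome.storage.sync` API; the `geminiApiKey` name is just an example, not an existing setting in this extension):

```ts
// Sketch only: store and read a user-provided key via chrome.storage
// (Manifest V3 promise-based API). "geminiApiKey" is an illustrative name.
async function saveGeminiKey(key: string): Promise<void> {
  await chrome.storage.sync.set({ geminiApiKey: key });
}

async function loadGeminiKey(): Promise<string | undefined> {
  const { geminiApiKey } = await chrome.storage.sync.get("geminiApiKey");
  return geminiApiKey;
}
```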
By the way, I’ve just opened an issue over at this link. It might be something you’d be interested in or could help with as well!