HeavenHM
Is anyone working on a PR for this? Or I can also solve this `.env` file issue here.
If it's working, then please create a **PR** for this.
> @haseeb-heaven we already support passing images via URL + local storage for gemini ai studio.
>
> The missing one is blob, but I don't have any code examples of...
> Great point - added here https://docs.litellm.ai/docs/providers/gemini

Thanks. But we need more documentation on the code: different code is needed for the different modes, offline files and online files, and how to access them...
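For example, something along these lines could go in the docs. This is only a rough sketch, assuming the Gemini provider accepts the OpenAI-style `image_url` content blocks; the model name, URL, and file path are placeholders:

```python
import base64
from litellm import completion

# Online file: pass the image by URL.
response = completion(
    model="gemini/gemini-pro-vision",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
        ],
    }],
)

# Offline file: read the local image and pass it as a base64 data URI.
with open("photo.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

response = completion(
    model="gemini/gemini-pro-vision",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{encoded}"}},
        ],
    }],
)
```

The only difference between the two modes is how the image reference is built; the rest of the call stays the same, which is exactly what the docs should spell out.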
> good idea - feel free to contribute any examples to docs 😄

@krrishdholakia @ishaan-jaff Check this PR: https://github.com/BerriAI/litellm/pull/1370
How much time will it take to review this?
We can add support through **LM Studio** and **Ollama**.
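Both run a local HTTP server, so wiring them up should be straightforward. A minimal sketch, assuming the default ports (11434 for Ollama, 1234 for LM Studio) and placeholder model names:

```python
import requests

# Ollama exposes a local REST API (default port 11434).
ollama_resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Hello, world!", "stream": False},
)
print(ollama_resp.json()["response"])

# LM Studio serves an OpenAI-compatible endpoint (default port 1234),
# so the standard /v1/chat/completions route works against it.
lmstudio_resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; LM Studio serves whichever model is loaded
        "messages": [{"role": "user", "content": "Hello, world!"}],
    },
)
print(lmstudio_resp.json()["choices"][0]["message"]["content"])
```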
> > ollama is already supported
>
> I haven't installed it because nowhere on the page was it mentioned that it is supported, so I had no...
> LLMs can be run locally via Ollama as of now, but the browser interaction and search would be impossible locally; if search can be eliminated in the step, this...
Okay, thanks. I will take a look at this issue.