tangyang

Results 5 comments of tangyang

> You need to set LLM_MODEL too. https://litellm.vercel.app/docs/providers/gemini > > Run this to check whether the LLM is working properly. > > ```python > import tomllib as toml > from litellm...

It's very strange. I have tested the Gemini API, and this proved the Gemini API can be connected properly. Below is the code and output: from litellm import completion import os os.environ['GEMINI_API_KEY'] =...

> Pass `-e LLM_MODEL="gemini/gemini-pro"` in docker command Many thanks, I have tried this, but it didn't work; the same error is still there. I just input how are u ? in...

I found that the Docker container keeps calling the OpenAI interface instead of the Gemini interface.
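litellm routes a request by the provider prefix on the model name, so if `LLM_MODEL` is unset or lacks the `gemini/` prefix, requests fall through to the default OpenAI route. A minimal sketch of checking the environment before starting the container (the `check_gemini_env` helper is hypothetical, not part of litellm or OpenDevin):

```python
import os


def check_gemini_env(env=None):
    """Return the configured model, or raise if requests would route to OpenAI.

    Hypothetical helper: litellm picks the provider from the prefix before
    the first '/', so 'gemini/gemini-pro' routes to Gemini, while a bare
    model name (or a missing LLM_MODEL) goes through the OpenAI route.
    """
    if env is None:
        env = os.environ
    model = env.get("LLM_MODEL", "")
    if not model.startswith("gemini/"):
        raise ValueError(
            f"LLM_MODEL={model!r} would not route to Gemini; "
            "set LLM_MODEL='gemini/gemini-pro'"
        )
    if not env.get("GEMINI_API_KEY"):
        raise ValueError("GEMINI_API_KEY is not set")
    return model
```

Running this inside the container (or before `docker run`) distinguishes a missing/misnamed `LLM_MODEL` from a genuine API failure.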

OK, thanks for all the help here. I have run OpenDevin successfully; the docker command should be written as follows when using the Gemini API: docker run \ -e GEMINI_API_KEY...
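The command above is truncated; based on the two environment variables confirmed in this thread, a full invocation might look like the sketch below. The image name/tag, port, and workspace mount are assumptions about a typical OpenDevin setup, not taken from the original comment; substitute your own values.

```shell
# Sketch only: GEMINI_API_KEY and LLM_MODEL come from this thread;
# the image, port, and volume mount are assumed and may differ locally.
docker run -it \
    -e GEMINI_API_KEY="your-gemini-api-key" \
    -e LLM_MODEL="gemini/gemini-pro" \
    -v /path/to/workspace:/opt/workspace_base \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:latest
```

The key point from the thread is that both variables must be passed with `-e`: the API key alone is not enough, since without `LLM_MODEL` litellm defaults to the OpenAI route.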