
Local deployment for Gemini not working

Open goheesheng opened this issue 5 months ago • 5 comments

This is my .env:

ENV='local'
ENABLE_OPENAI='false'
OPENAI_API_KEY=''
ENABLE_ANTHROPIC='false'
ANTHROPIC_API_KEY=''
ENABLE_AZURE='false'
AZURE_DEPLOYMENT=''
AZURE_API_KEY=''
AZURE_API_BASE=''
AZURE_API_VERSION=''
ENABLE_AZURE_GPT4O_MINI='false'
AZURE_GPT4O_MINI_DEPLOYMENT=''
AZURE_GPT4O_MINI_API_KEY=''
AZURE_GPT4O_MINI_API_BASE=''
AZURE_GPT4O_MINI_API_VERSION=''
ENABLE_GEMINI='true'
GEMINI_API_KEY='I have hidden this' # I have used the real API KEY
ENABLE_NOVITA='false'
NOVITA_API_KEY=''
LLM_KEY='GEMINI_2.5_PRO_PREVIEW'  # Why is this not working?
SECONDARY_LLM_KEY=''
BROWSER_TYPE='chromium-headful'
MAX_SCRAPING_RETRIES='0'
VIDEO_PATH='./videos'
BROWSER_ACTION_TIMEOUT_MS='5000'
MAX_STEPS_PER_RUN='50'
LOG_LEVEL='INFO'
LITELLM_LOG='CRITICAL'
DATABASE_STRING='postgresql+psycopg://skyvern@localhost/skyvern'
PORT='8000'
ANALYTICS_ID='I have hidden this' # I have used the real API KEY
ENABLE_LOG_ARTIFACTS='false'
ENABLE_VOLCENGINE='false'
ENABLE_OPENAI_COMPATIBLE='false'
SKYVERN_BASE_URL='http://localhost:8000'
SKYVERN_API_KEY='I have hidden this'  # I have used the real API KEY

  File "C:\Users\test\Desktop\skyvern\skyvern\forge\sdk\api\llm\models.py", line 110, in dummy_llm_api_handler
    raise NotImplementedError("Your LLM provider is not configured. Please configure it in the .env file.")
NotImplementedError: Your LLM provider is not configured. Please configure it in the .env file.
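
One thing that can help narrow this down: the .env above sets LITELLM_LOG, and Skyvern drives providers through LiteLLM, so the Gemini key itself can be sanity-checked outside of Skyvern. A minimal sketch, assuming litellm is installed in the same venv and that the model name below is available to your key:

# Sanity check of the Gemini key outside of Skyvern, via LiteLLM.
# Assumptions: litellm is installed and gemini-2.5-flash is enabled for your key.
import os
import litellm

os.environ["GEMINI_API_KEY"] = "your-real-key"  # same value as GEMINI_API_KEY in .env

response = litellm.completion(
    model="gemini/gemini-2.5-flash",
    messages=[{"role": "user", "content": "Reply with the single word: pong"}],
)
print(response.choices[0].message.content)

If this prints a reply, the key and quota are fine and the problem is on the Skyvern configuration side (the traceback shows the dummy handler being called, i.e. the configured LLM_KEY was never wired up).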

goheesheng avatar Jul 20 '25 06:07 goheesheng

Could you try it without the quotation marks? I believe .env files are sensitive to that.

suchintan avatar Jul 22 '25 05:07 suchintan

Could you try it without the quotation marks? I believe .env files are sensitive to that.

Removing the quotes doesn't work, and it shouldn't matter either.
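
For what it's worth, the quotes are most likely fine: to the best of my knowledge Skyvern's settings loading goes through python-dotenv (via pydantic-settings), which strips matching quotes. A quick sketch to confirm that assumption:

# Demonstrates that python-dotenv strips matching quotes, so quoted and unquoted
# values load identically. Assumption: Skyvern's config loading uses python-dotenv.
from io import StringIO
from dotenv import dotenv_values

quoted = dotenv_values(stream=StringIO("LLM_KEY='GEMINI_2.5_PRO_PREVIEW'"))
unquoted = dotenv_values(stream=StringIO("LLM_KEY=GEMINI_2.5_PRO_PREVIEW"))
print(quoted, unquoted)  # both: {'LLM_KEY': 'GEMINI_2.5_PRO_PREVIEW'}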

goheesheng avatar Jul 24 '25 10:07 goheesheng

Same issue for me. I tried different approaches but always get the LLM error.

Who-Code avatar Jul 30 '25 05:07 Who-Code

Same issue for me. I tried different approaches but always get the LLM error.

Which LLM API key did you use?

I used Gemini and Perplexity. I have yet to try OpenAI.

goheesheng avatar Jul 30 '25 06:07 goheesheng

News from my side: I recreated the project and now it works.

I think I had a duplicate key: while preparing to switch between models, I assume I ended up with LLM_KEY defined twice.
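
If anyone wants to rule that out in their own file, here is a small stdlib-only sketch that flags keys defined more than once in a .env (a later duplicate would quietly override an earlier LLM_KEY and could explain the "provider not configured" error):

# Stdlib-only check for duplicate keys in a .env file.
from collections import Counter
from pathlib import Path

keys = [
    line.split("=", 1)[0].strip()
    for line in Path(".env").read_text().splitlines()
    if line.strip() and not line.lstrip().startswith("#") and "=" in line
]
for key, count in Counter(keys).items():
    if count > 1:
        print(f"{key} is defined {count} times")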

My reduced, working .env looks like this (but with correct API keys, of course):

ENABLE_GEMINI=true
GEMINI_API_KEY=XXXXXXXXXXXXXXXXXXXX
LLM_KEY=GEMINI_2.5_FLASH

BROWSER_TYPE='chromium-headful'
SKYVERN_BASE_URL='http://localhost:8000'
ANALYTICS_ID='7XXXXXX-XXXX-XXXX-XXXX-XXXXX'
SKYVERN_API_KEY='XXXXXXXXXXXXXXX'

I created a Python venv and set everything up using the quickstart command.

Afterwards I start the Skyvern server by running

skyvern run server

Now I leave it running, and in a new terminal in the same Python venv I run my test script as given in the documentation.
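
For reference, a minimal sketch of that kind of test script, modeled on the quickstart documentation; the Skyvern(base_url=..., api_key=...) constructor and the async run_task(prompt=...) call are assumed to match the SDK version installed in the venv:

# Minimal test against the locally running server (`skyvern run server`).
# Assumptions: the installed SDK exposes Skyvern(base_url=..., api_key=...) and an
# async run_task(prompt=...), as in the quickstart docs.
import asyncio
from skyvern import Skyvern

async def main():
    skyvern = Skyvern(
        base_url="http://localhost:8000",  # same as SKYVERN_BASE_URL in .env
        api_key="YOUR_SKYVERN_API_KEY",    # same as SKYVERN_API_KEY in .env
    )
    task = await skyvern.run_task(prompt="Find the top post on Hacker News today")
    print(task)

asyncio.run(main())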

In my opinion, this issue can be closed.

I'll clean up my minimal testing project and push it as soon as I can so others can reuse it if they want.

Who-Code avatar Aug 06 '25 16:08 Who-Code