
LiteLLM does not find my openai key

Open · joostwmd opened this issue 11 months ago · 6 comments

Hello Community,

I am trying to use LLM-based extraction for my crawler, but I have run into a problem that I am unable to fix.

  1. I installed Crawl4AI with Docker Compose using a local Dockerfile, following the tutorial from your docs. This worked and I could make requests to my local instance.

  2. Then I tried to use LLM extraction, which is where the error occurred. I cannot figure out how to add my OpenAI API key to the local instance. I followed this tutorial from your docs:

    1. I added OPENAI_API_KEY=${OPENAI_API_KEY} to my docker-compose file (see the sketch at the end of this comment)
    2. I added OPENAI_API_KEY=sk-proj.... to my .env
    3. I restarted my Docker instance and ran echo $OPENAI_API_KEY in my Docker Desktop shell, which returned my key, so my Docker instance should have access to it
  3. But when running a simple LLM extraction, as shown in the documentation, I run into this error:

Error in thread execution: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

I cannot figure out how to resolve this and would appreciate help

I made a repo for reproduction: https://github.com/joostwmd/crawl4ai
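
For reference, a minimal version of the setup described above would look roughly like this (placeholder values, not the exact files from the repo):

# docker-compose.yml (relevant part)
services:
  crawl4ai:
    build: .
    ports:
      - "11235:11235"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}

# .env (next to docker-compose.yml)
OPENAI_API_KEY=sk-proj-...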

joostwmd avatar Jan 18 '25 18:01 joostwmd

@joostwmd Thanks for using the library. I need to make some changes in the library for compatibility with the new version of LiteLLM. Tomorrow, when I release the new Crawl4AI version, I will fix this and you will be connected again.

unclecode avatar Jan 20 '25 12:01 unclecode

@unclecode I think the link in the documentation is wrong; the right one should be: https://github.com/BerriAI/litellm

garethjax avatar Jan 28 '25 08:01 garethjax

@garethjax Thanks for pointing that out.

@aravindkarnam this needs a test.

unclecode avatar Jan 28 '25 15:01 unclecode

Running into the same issue...

Plan JSON (from LLM): [{'index': 0, 'error': True, 'tags': ['error'], 'content': "litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}, {'index': 0, 'error': True, 'tags': ['error'], 'content': "litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}]
Error: LLM returned a list, not a single dict. Possibly an error or chunk response.

Any luck?

devopslondon2021 avatar Feb 20 '25 16:02 devopslondon2021

Nope, I could not resolve this issue. I am installing it using a local Dockerfile and pulling the image from the hub. The latest version was uploaded 3 months ago:

https://hub.docker.com/r/unclecode/crawl4ai/tags

I think we need to wait until a new docker build is available

joostwmd avatar Feb 20 '25 16:02 joostwmd

I tried a few things and noticed that when I added cache_mode the code worked without issues. Try adding cache_mode=CacheMode.BYPASS to your CrawlerRunConfig:

plan_config = CrawlerRunConfig(
    extraction_strategy=plan_strategy,
    wait_for_images=True,
    cache_mode=CacheMode.BYPASS,
)
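
For context, here is a minimal end-to-end sketch of how such a config could be wired up with the Python SDK. The provider, instruction, and URL are placeholders, and the exact LLMExtractionStrategy keyword arguments depend on your crawl4ai version (this mirrors the 0.4.x-style signature):

import asyncio
import os

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, CacheMode
from crawl4ai.extraction_strategy import LLMExtractionStrategy

async def main():
    # Build the LLM extraction strategy; the API key is read from the environment
    plan_strategy = LLMExtractionStrategy(
        provider="openai/gpt-4o-mini",
        api_token=os.getenv("OPENAI_API_KEY"),
        instruction="Extract the main points of the page as a short plan",
        extraction_type="block",
    )

    plan_config = CrawlerRunConfig(
        extraction_strategy=plan_strategy,
        wait_for_images=True,
        cache_mode=CacheMode.BYPASS,  # skip cached results so the LLM call actually runs
    )

    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="https://example.com", config=plan_config)
        print(result.extracted_content)

asyncio.run(main())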

devopslondon2021 avatar Feb 20 '25 16:02 devopslondon2021

@garethjax We have a brand new Docker setup now. Make sure to pull the latest image with docker pull unclecode/crawl4ai:latest

Here it explains how to set up environment variables, including OpenAI keys.

Here's the gist

  1. Create a .llm.env file in your working directory
  2. Add your keys as follows:
# Create a .llm.env file with your API keys
cat > .llm.env << EOL
# OpenAI
OPENAI_API_KEY=sk-your-key

# Anthropic
ANTHROPIC_API_KEY=your-anthropic-key

# Other providers as needed
# DEEPSEEK_API_KEY=your-deepseek-key
# GROQ_API_KEY=your-groq-key
# TOGETHER_API_KEY=your-together-key
# MISTRAL_API_KEY=your-mistral-key
# GEMINI_API_TOKEN=your-gemini-token
EOL
  3. Run your container as follows:
# Make sure .llm.env is in the current directory
docker run -d \
  -p 11235:11235 \
  --name crawl4ai \
  --env-file .llm.env \
  --shm-size=1g \
  unclecode/crawl4ai:latest
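
To sanity-check that the container actually received the key, you can print it from inside the running container (a quick check, not an official step from the docs):

docker exec crawl4ai printenv OPENAI_API_KEY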

aravindkarnam avatar May 07 '25 10:05 aravindkarnam

I already followed the exact steps in the latest docs but still got the error: OpenAIException - You didn't provide an API key.

Deanfei avatar Jul 11 '25 08:07 Deanfei

(screenshot omitted)

Based on this step, we have to explicitly comment out the environment: section. Otherwise it overrides the values from env_file with empty values whenever the keys are not exported in the shell, which is a step not mentioned in the docs.

  ports:
    - "11235:11235" # Gunicorn port
  env_file:
    - .llm.env # API keys (create from .llm.env.example)
  # environment:
  #   - OPENAI_API_KEY=${OPENAI_API_KEY:-}
  #   - DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY:-}
  #   - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:-}
  #   - GROQ_API_KEY=${GROQ_API_KEY:-}
  #   - TOGETHER_API_KEY=${TOGETHER_API_KEY:-}
  #   - MISTRAL_API_KEY=${MISTRAL_API_KEY:-}
  #   - GEMINI_API_TOKEN=${GEMINI_API_TOKEN:-}
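
Alternatively, if you prefer to keep the environment: section, the ${OPENAI_API_KEY:-} defaults only resolve to real values when the variables are exported in the shell that runs Compose, for example:

export OPENAI_API_KEY=sk-your-key
docker compose up -d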

Deanfei avatar Jul 11 '25 08:07 Deanfei