LiteLLM does not find my OpenAI key
Hello Community,
I am trying to use LLM-based extraction for my crawler, but I have run into a problem that I am unable to fix.
I installed Crawl4AI with Docker Compose using a local Dockerfile, following the tutorial from your docs. This worked and I could make requests to my local instance.

Then I tried to use LLM extraction, which is where the error occurred. I cannot figure out how to add my OpenAI API key to the local instance. I followed this tutorial from your docs:

- I added `OPENAI_API_KEY=${OPENAI_API_KEY}` to my docker compose file
- I added `OPENAI_API_KEY=sk-proj....` to my `.env`
- I restarted my docker instance and ran `echo $OPENAI_API_KEY` in my Docker Desktop shell, which returned my key, so my docker instance should have access to the key
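One detail that is easy to miss: `${OPENAI_API_KEY}` in a compose file is substituted from the *host* shell environment at `docker compose up` time, not from inside the container. A quick check you can run on the host before starting the stack (the key value below is just a placeholder):

```shell
# Fail fast if the variable compose will interpolate is not exported.
# sk-proj-placeholder is an example value, not a real key.
export OPENAI_API_KEY=sk-proj-placeholder
echo "${OPENAI_API_KEY:?OPENAI_API_KEY is not set}"
```

If this prints an error instead of the key, compose will interpolate an empty string into the container environment.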
But when running a simple LLM extraction, as shown in the documentation, I run into this error:
```
Error in thread execution: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
```
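Worth noting: the rejected key in the error is literally `no-token`, which looks like a server-side placeholder substituted when no real key is found, meaning the request never carried your actual key at all. A minimal sketch of that kind of fallback chain (the function name and placeholder handling here are illustrative assumptions, not Crawl4AI's actual code):

```python
import os

def resolve_openai_key(request_token=None):
    """Illustrative fallback: prefer an explicit token, then the
    environment variable, then a 'no-token' placeholder."""
    return request_token or os.environ.get("OPENAI_API_KEY") or "no-token"

# With no token passed and no env var set, the placeholder leaks
# through to OpenAI, which rejects it with a 401.
os.environ.pop("OPENAI_API_KEY", None)
print(resolve_openai_key())  # -> no-token
```

So the 401 points at the key never reaching the server process, not at a malformed key.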
I cannot figure out how to resolve this and would appreciate help.
I made a repo for reproduction: https://github.com/joostwmd/crawl4ai
@joostwmd Thanks for using the library. I need to make some changes in the library for compatibility with the new version of LiteLLM. Tomorrow, when I release the new Crawl4AI version, I will fix this, and you will be up and running again.
@unclecode I think the link in the documentation is wrong, because the right one should be: https://github.com/BerriAI/litellm
@garethjax Thx for pointing out.
@aravindkarnam this needs a test.
Running into the same issue...
```
Plan JSON (from LLM): [{'index': 0, 'error': True, 'tags': ['error'], 'content': "litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}, {'index': 0, 'error': True, 'tags': ['error'], 'content': "litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: no-token. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}]
```
Error: LLM returned a list, not a single dict. Possibly an error or chunk response.
Any luck?
Nope, I could not resolve this issue. I am installing it using a local Dockerfile and pulling the image from the hub. The latest version was uploaded 3 months ago:
https://hub.docker.com/r/unclecode/crawl4ai/tags
I think we need to wait until a new Docker build is available.
I tried a few things and noticed that when I added `cache_mode` the code worked without issues: try adding `cache_mode=CacheMode.BYPASS` to your `CrawlerRunConfig`.

```python
from crawl4ai import CrawlerRunConfig, CacheMode

plan_config = CrawlerRunConfig(
    extraction_strategy=plan_strategy,
    wait_for_images=True,
    cache_mode=CacheMode.BYPASS,  # skip cached results so the LLM is called fresh
)
```
@garethjax We have a brand new Docker setup now. Make sure to pull the latest with

```shell
docker pull unclecode/crawl4ai:latest
```

The docs explain how to set up environment variables, including OpenAI keys. Here's the gist:
- Create a `.llm.env` file in your working directory
- Add your keys as follows:
```shell
# Create a .llm.env file with your API keys
cat > .llm.env << EOL
# OpenAI
OPENAI_API_KEY=sk-your-key
# Anthropic
ANTHROPIC_API_KEY=your-anthropic-key
# Other providers as needed
# DEEPSEEK_API_KEY=your-deepseek-key
# GROQ_API_KEY=your-groq-key
# TOGETHER_API_KEY=your-together-key
# MISTRAL_API_KEY=your-mistral-key
# GEMINI_API_TOKEN=your-gemini-token
EOL
```
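For reference, the `--env-file` format is a plain list of `KEY=VALUE` lines in which blank lines and `#` comments are ignored. A small sketch of that parsing behaviour (my own illustrative parser, not Docker's code):

```python
def parse_env_file(text):
    """Parse KEY=VALUE lines, skipping blanks and '#' comments,
    mirroring the shape of docker's --env-file format."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """# OpenAI
OPENAI_API_KEY=sk-your-key
# GROQ_API_KEY=your-groq-key
"""
print(parse_env_file(sample))  # -> {'OPENAI_API_KEY': 'sk-your-key'}
```

This is why the commented-out provider keys in the file above are simply ignored rather than set to empty values.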
- Run your container as follows:

```shell
# Make sure .llm.env is in the current directory
docker run -d \
  -p 11235:11235 \
  --name crawl4ai \
  --env-file .llm.env \
  --shm-size=1g \
  unclecode/crawl4ai:latest
```
I already followed the exact steps in the latest docs but still got the error: `OpenAIException - You didn't provide an API key.`
Based on this step, we have to explicitly comment out the `environment:` section. Otherwise it overrides the values from `env_file` with empty values whenever these keys are not exported in the shell, which is a step not mentioned in the docs.
```yaml
ports:
  - "11235:11235" # Gunicorn port
env_file:
  - .llm.env # API keys (create from .llm.env.example)
# environment:
#   - OPENAI_API_KEY=${OPENAI_API_KEY:-}
#   - DEEPSEEK_API_KEY=${DEEPSEEK_API_KEY:-}
#   - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:-}
#   - GROQ_API_KEY=${GROQ_API_KEY:-}
#   - TOGETHER_API_KEY=${TOGETHER_API_KEY:-}
#   - MISTRAL_API_KEY=${MISTRAL_API_KEY:-}
#   - GEMINI_API_TOKEN=${GEMINI_API_TOKEN:-}
```
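The override behaviour is easy to see if you simulate Compose's precedence: entries under `environment:` win over `env_file:`, and `${OPENAI_API_KEY:-}` expands to an empty string when the variable is not exported in the shell, so the empty value clobbers the real key from `.llm.env`. A sketch of that merge logic (my own simulation, not Compose's code):

```python
def effective_container_env(env_file, environment):
    """Compose applies env_file values first, then environment entries on top."""
    merged = dict(env_file)
    merged.update(environment)
    return merged

from_file = {"OPENAI_API_KEY": "sk-your-key"}
# ${OPENAI_API_KEY:-} with nothing exported in the shell -> empty string
from_environment_section = {"OPENAI_API_KEY": ""}

print(effective_container_env(from_file, from_environment_section))
# -> {'OPENAI_API_KEY': ''}  (the real key is clobbered)
```

With the `environment:` section commented out, the `env_file` values survive untouched.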