
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

anstonjie opened this issue 1 year ago • 10 comments

🐛 Describe the bug

openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

anstonjie avatar Jul 22 '24 13:07 anstonjie

Hello @anstonjie Could you share an error screenshot? I think you didn't set the OpenAI key in your environment; you can also pass it via the CLI.
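
As a quick sanity check along those lines (a minimal sketch; the placeholder key value is an assumption, not a real key), you can confirm the variable is actually visible to the Python process before constructing any client:

```python
import os

# Set the key in-process if it isn't already exported in your shell
# ("sk-your-key-here" is a placeholder). Alternatively, put
# OPENAI_API_KEY=sk-... in a .env file and call load_dotenv() first.
os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")

# Fail fast with a clear message instead of a deep traceback later.
key = os.environ.get("OPENAI_API_KEY")
assert key, "OPENAI_API_KEY is not set"
print("OPENAI_API_KEY is visible to this process")
```

Note that `load_dotenv()` must run before the client is constructed, otherwise the variable won't be visible at the point the check happens.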

skstanwar avatar Jul 22 '24 17:07 skstanwar

+1

Trying out the example listed in the docs: https://docs.mem0.ai/llms#togetherai

import os
from mem0 import Memory
from dotenv import load_dotenv

load_dotenv()

config = {
    "llm": {
        "provider": "together",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

I have TOGETHER_API_KEY set in my .env, but I'm getting:

Traceback (most recent call last):
  File "/Users/secureconnectionforguest/Desktop/Python/NJITScheduleSniffer/app/llm_memory.py", line 39, in <module>
    m = Memory.from_config(config)
  File "/usr/local/lib/python3.10/site-packages/mem0/memory/main.py", line 103, in from_config
    return cls(config)
  File "/usr/local/lib/python3.10/site-packages/mem0/memory/main.py", line 69, in __init__
    self.embedding_model = EmbedderFactory.create(self.config.embedder.provider)
  File "/usr/local/lib/python3.10/site-packages/mem0/utils/factory.py", line 43, in create
    embedder_instance = load_class(class_type)()
  File "/usr/local/lib/python3.10/site-packages/mem0/embeddings/openai.py", line 8, in __init__
    self.client = OpenAI()
  File "/usr/local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

rshah713 avatar Jul 22 '24 17:07 rshah713

I try to run this code: https://docs.mem0.ai/llms#google-ai

import os
from mem0 import Memory

os.environ["GEMINI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gemini/gemini-pro",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

I have a GEMINI_API_KEY set, but I got this error:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

jalihui avatar Jul 25 '24 03:07 jalihui

Using GEMINI_API_KEY, same error occurs!

wangmingjunabc avatar Jul 25 '24 07:07 wangmingjunabc

Using Azure_API_KEY, same error!

SpaceLearner avatar Jul 29 '24 14:07 SpaceLearner

Using GROQ_API_KEY, same error!!

amostsai avatar Jul 30 '24 02:07 amostsai

Hey @amostsai @SpaceLearner @wangmingjunabc @jalihui @rshah713 @anstonjie The error is caused because the default embedding model is set to OpenAI.
Try using a different embedder model. PR #1627 might be helpful for using a different embedding model.

kmitul avatar Aug 01 '24 05:08 kmitul

@kmitul thanks for the PR. The docs will definitely need to be updated so users can configure the different supported models properly. In my example I was trying to use provider: together — which embedding model would I need to use with this provider? Can you give examples?

rshah713 avatar Aug 01 '24 14:08 rshah713

Hey @rshah713 You can choose any embedding model and pair it with any LLM, as they can be selected independently. Ideally, you would use an embedding model from TogetherAI. I haven't added support for TogetherAI embedding models yet, but I plan to do so soon. In the meantime, you can use any Hugging Face model. For example:

config = {
    "llm": {
        "provider": "together",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    },
    "embedder": {
        "provider": "huggingface",
        "config": {
            "model": "multi-qa-MiniLM-L6-cos-v1"
        }
    }
}

kmitul avatar Aug 02 '24 02:08 kmitul

Hey @anstonjie @rshah713 @wangmingjunabc @SpaceLearner @amostsai Sorry for the inconvenience. The docs will be updated soon to make this clearer.

Here we use OpenAI as the embedding model, which is why it needs an OpenAI key. We will soon add an option to choose your own embedding model.
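
In the meantime, the comments above suggest two workarounds: override the embedder as in the Hugging Face example, or export an OpenAI key alongside your LLM provider's key, since the embedder still calls OpenAI. A minimal sketch of the second option (both key values are placeholders, and this mirrors the TogetherAI example from earlier in the thread rather than an official recipe):

```python
import os
from mem0 import Memory

# The default embedder is OpenAI, so an OpenAI key is required even
# when the LLM provider is TogetherAI. Placeholder values below.
os.environ["OPENAI_API_KEY"] = "sk-your-openai-key"   # used by the embedder
os.environ["TOGETHER_API_KEY"] = "your-together-key"  # used by the LLM

config = {
    "llm": {
        "provider": "together",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
```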

Dev-Khant avatar Aug 02 '24 14:08 Dev-Khant

Using Azure_API_KEY, same error!

dosnshcu avatar Sep 02 '24 05:09 dosnshcu

@Dev-Khant any update on this? The docs for Azure OpenAI are completely outdated here; see config.llm.provider: here.

uahmad235 avatar Sep 11 '24 09:09 uahmad235

Hey @uahmad235 Yes, this is fixed now. Please check: https://docs.mem0.ai/components/llms/models/azure_openai. Please let me know if you still face any issues.

Dev-Khant avatar Sep 13 '24 16:09 Dev-Khant

Closing as solved. Please feel free to reopen if you face the same problem. Thanks.

Dev-Khant avatar Sep 13 '24 16:09 Dev-Khant