
[BUG] Error with memory=True using AzureOpenAI provider

Open morhyak opened this issue 1 year ago • 9 comments

Description

Hi, I encountered an issue when using the AzureOpenAI provider. Setting memory=True results in a parameter error because the embedder cannot be defined with additional parameters beyond model_name and provider. In my configuration for Azure embeddings, I need to include all parameters, including SSO access details. For example, the setup requires a configuration like the following:

"model": EMBEDDING_MODEL,
            "deployment": EMBEDDING_MODEL ,
            "openai_api_key":API_KEY,
            "azure_endpoint": API_BASE,
            "openai_api_version":API_VERSION,
            "openai_api_type":TYPE,
            "default_headers": {
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json"

Please let me know if there's a workaround or fix for this issue. Thanks

Steps to Reproduce

crew = Crew(
    agents=[support_agent, support_quality_assurance_agent],
    tasks=[inquiry_resolution, quality_assurance_review],
    verbose=True,
    memory=True,
    llm=llm,
    embedder={
        "provider": "azure_openai",
        "config": {
            "model": EMBEDDING_MODEL,
            "deployment": EMBEDDING_MODEL,
            "openai_api_key": API_KEY,
            "azure_endpoint": API_BASE,
            "openai_api_version": API_VERSION,
            "openai_api_type": TYPE,
            "default_headers": {
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json"
            }
        }
    }
)

Expected behavior

I expected the crew to run successfully when started with crew.kickoff().

Operating System

Windows 10

Python Version

3.11

crewAI Version

0.55.2

crewAI Tools Version

0.12.1

Virtual Environment

Venv

Evidence

AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

Possible Solution

I attempted to modify the source code by adding additional parameter inputs for the embedder in misc.py and embedder-base.py. However, when running kickoff, I still encounter an SSO-related error: AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

Additional context

morhyak avatar Sep 29 '24 08:09 morhyak

First, update crewAI if you can:

pip install --upgrade crewai crewai-tools

crewAI now uses LiteLLM.

Embedder config for Azure OpenAI (AOAI):

    embedder={
        "provider": "azure_openai",
        "config": {
            "model": "<model>",
            "deployment_name": "<dep name>",
        },
    },

And delete the "OPENAI_API_BASE" env variable if it exists: https://github.com/langchain-ai/langchain/discussions/17790#discussioncomment-8690960
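If that variable is set in the current process, it can also be removed in Python before building the crew (a minimal sketch for the variable named above):

```python
import os

# Remove the conflicting variable if it is set; passing a default to
# pop avoids a KeyError when the variable does not exist.
os.environ.pop("OPENAI_API_BASE", None)
```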

voytas75 avatar Sep 29 '24 17:09 voytas75

... and prepare the environment, for example:

os.environ.update({
    "AZURE_API_KEY": os.getenv("AZURE_OPENAI_API_KEY"),
    "AZURE_API_BASE": os.getenv("AZURE_OPENAI_ENDPOINT"),
    "AZURE_API_VERSION": os.getenv("AZURE_OPENAI_API_VERSION"),
})
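Note that os.environ.update raises a TypeError if any of the os.getenv calls returns None (i.e. a source variable is unset), so a guarded version may be safer (a sketch using the same variable names):

```python
import os

# Map the Azure OpenAI variables to the AZURE_* names LiteLLM expects,
# skipping any that are unset (os.environ rejects None values).
mapping = {
    "AZURE_API_KEY": "AZURE_OPENAI_API_KEY",
    "AZURE_API_BASE": "AZURE_OPENAI_ENDPOINT",
    "AZURE_API_VERSION": "AZURE_OPENAI_API_VERSION",
}
for target, source in mapping.items():
    value = os.getenv(source)
    if value is not None:
        os.environ[target] = value
```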

voytas75 avatar Sep 29 '24 17:09 voytas75

Hi, thanks for your response. However, the SSO issue wasn't addressed. I need to define the Authorization token as follows:

    embedder={
        "provider": "azure_openai",
        "config": {
            "model": EMBEDDING_MODEL,
            "deployment": EMBEDDING_MODEL,
            "openai_api_key": API_KEY,
            "azure_endpoint": API_BASE,
            "openai_api_version": API_VERSION,
            "openai_api_type": TYPE,
            "default_headers": {
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json"
            }
        }
    }
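A 401 with a static access_token often means the token has expired by the time the embedder uses it. One way to structure this is a small helper that builds the headers from whatever token the SSO flow yields; the helper name below is hypothetical, and in a real setup the token could come from, for example, azure-identity's DefaultAzureCredential().get_token("https://cognitiveservices.azure.com/.default").token so it is freshly minted per run:

```python
def build_auth_headers(access_token: str) -> dict:
    """Build the default_headers dict for the embedder config from a
    bearer token obtained via the SSO flow."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
```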

morhyak avatar Sep 30 '24 07:09 morhyak

Double-check whether you have azure_deployment in the embedder config; the error suggests you are using the wrong argument name (azure_deployment).

voytas75 avatar Sep 30 '24 08:09 voytas75

After configuring all the parameters, I encountered an error related to the SSO issue.

AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Invalid Authorization token'}

morhyak avatar Sep 30 '24 08:09 morhyak

Is API_BASE correct (https://YOUR_RESOURCE_NAME.openai.azure.com/)? "Invalid Authorization token" probably comes from OpenAI; Azure OpenAI uses different wording: "Access denied due to invalid subscription key. Make sure to provide a valid key for an active subscription."

voytas75 avatar Sep 30 '24 09:09 voytas75

Shouldn't the Azure OpenAI base be like https://YOUR_ZONE.api.cognitive.microsoft.com/ ?

Edit: the subscription-key error also appears when your computer is logged in to AD with a different account/subscription than the Azure one; check the documentation about "az login --use-device-code".

sorin-costea avatar Sep 30 '24 11:09 sorin-costea

No, see https://learn.microsoft.com/en-us/azure/ai-services/openai/reference; the Azure Cognitive Services APIs are a different service. I don't think crewAI itself or LiteLLM supports Cognitive.

voytas75 avatar Sep 30 '24 18:09 voytas75

All the parameter configurations are correct (I double-checked them in another process). However, after adding extra options to the config parameters (for the SSO tokens), I am now encountering the following error:

ConnectError: [Errno 11001] getaddrinfo failed
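getaddrinfo failed is a DNS resolution error, which usually means the hostname in azure_endpoint is wrong or unreachable from the machine. A quick diagnostic sketch (the endpoint URL stands in for the API_BASE value from the config above):

```python
import socket
from urllib.parse import urlparse

def can_resolve(endpoint: str) -> bool:
    """Return True if the endpoint's hostname resolves via DNS."""
    host = urlparse(endpoint).hostname or endpoint
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```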

morhyak avatar Oct 01 '24 07:10 morhyak

Hi, I have the same error in a different environment. I am using Gemini on Vertex AI via LiteLLM. My crew runs with memory=False; when I set memory to True, I get this error: AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: fake. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

I am doing:

crew = Crew(
    agents=[support_agent, support_quality_assurance_agent],
    tasks=[inquiry_resolution, quality_assurance_review],
    verbose=True,
    memory=True
)
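With memory=True and no embedder argument, crewAI falls back to OpenAI embeddings for memory, which would explain the invalid_api_key error even though the LLM is Gemini. Passing an explicit Google embedder should avoid the fallback; the key names below are assumptions modeled on the azure_openai embedder configs shown earlier in this thread and should be checked against the crewAI docs for your version:

```python
import os

# Hypothetical embedder config routing memory embeddings to Google
# instead of the default OpenAI backend.
embedder = {
    "provider": "google",
    "config": {
        "model": "models/embedding-001",
        "api_key": os.getenv("GOOGLE_API_KEY"),
    },
}
```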

httplups avatar Oct 08 '24 13:10 httplups

No, see https://learn.microsoft.com/en-us/azure/ai-services/openai/reference; the Azure Cognitive Services APIs are a different service. I don't think crewAI itself or LiteLLM supports Cognitive.

LiteLLM says it already supports the Cognitive endpoints (see https://github.com/BerriAI/litellm/discussions/5995), and as far as I can tell the only way to use Azure OpenAI is with this regional Cognitive Services endpoint. The resource-based endpoints are called legacy, and fewer and fewer regions support them. I wasn't able to find a region supporting them, but I also didn't try every region...

PS: maybe all it needs is a dependency version bump?

PPS: I get it running fine like this, so I guess the issue lies with AzureChatOpenAI:

azure_llm = LLM(
    model="azure/deployment_name",
    base_url=regional_complete_deployment_url,
    api_key=api_key
)

sorin-costea avatar Oct 08 '24 13:10 sorin-costea

OK, maybe they are cooking :) Their docs (https://docs.litellm.ai/docs/providers/azure) have no info about ...cognitive.microsoft.com... So does your code using cognitive.microsoft.com actually work through LiteLLM?

voytas75 avatar Oct 10 '24 19:10 voytas75

@voytas75 yes, it works fine, as in the example above. Only langchain's component doesn't, but since its version is pinned by crewai I didn't try their latest; maybe it works there in the meantime too.

PS: and you're right, nobody has gotten far enough to document those new URLs yet.

sorin-costea avatar Oct 15 '24 14:10 sorin-costea

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Nov 15 '24 12:11 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Nov 21 '24 12:11 github-actions[bot]