
Issue with Azure OpenAI API key

Open Pratekh opened this issue 10 months ago • 5 comments

response = rails.generate(messages=[{
    "role": "user",
    "content": "What is the capital of France?"
}])
print(response["content"])

WARNING: nemoguardrails.actions.action_dispatcher:Error while execution generate_user_intent: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

I'm sorry, an internal error has occurred.

Pratekh avatar Apr 25 '24 20:04 Pratekh

@Pratekh : can you provide more details on the config (e.g., the content of config.yml)? This is probably a configuration issue.

drazvan avatar Apr 26 '24 11:04 drazvan
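For context on why a configuration problem surfaces as a 404: Azure OpenAI routes chat requests to a deployment-specific path built from the endpoint, deployment name, and API version, so a mismatched `deployment_name` or `azure_endpoint` makes the request hit a resource that does not exist on the server. A minimal sketch of the documented URL shape (the helper name and example values are illustrative):

```python
def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build the deployment-specific chat-completions URL that Azure OpenAI expects."""
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# If "abc" is not an actual deployment in the Azure resource, this path 404s.
print(azure_chat_url("https://abc.openai.azure.com/", "abc", "2023-07-01-preview"))
```

If the deployment name shown in the Azure portal differs from what the client sends, every request to this path returns `404 Resource not found`, regardless of whether the API key is valid.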

%%writefile config/config.yml
type: main
engine: azure
model: gpt-35-turbo-16k
parameters:
    azure_endpoint: https://abc.openai.azure.com/
    api_version: 2023-07-01-preview
    deployment_name: abc
    api_key: 00000000000000000000000

Pratekh avatar Apr 30 '24 04:04 Pratekh
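One thing to check against the snippet above: the NeMo Guardrails documentation nests model settings under a top-level `models:` list in `config.yml`, so a missing `models:` key (or a `deployment_name` that does not match the deployment in the Azure portal) can both produce this 404. A hedged sketch of that documented layout, with placeholder values:

```yaml
models:
  - type: main
    engine: azure
    model: gpt-35-turbo-16k
    parameters:
      azure_endpoint: https://abc.openai.azure.com/
      api_version: 2023-07-01-preview
      deployment_name: abc
      api_key: "<your-azure-openai-key>"
```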

How can I fix this issue?

Pratekh avatar May 02 '24 07:05 Pratekh

Any update on this issue? We are also using Azure OpenAI and getting the same error while running the nemoguardrails server.

rohitk-cognizant avatar May 20 '24 10:05 rohitk-cognizant

@rohitk-cognizant : to debug this, can you try to get functional code where you initialize the LLM separately, in a "pure LangChain" way?

from langchain_openai import AzureOpenAI

model = AzureOpenAI(...)
print(model.invoke("test prompt"))

And then try to pass that directly to the LLMRails instance:

rails = LLMRails(config=config, llm=model)
response = rails.generate(messages=[{
"role": "user",
"content": "What is the capital of France?"
}])
print(response["content"])

If this works, then I can point you to where exactly you can check how the AzureOpenAI engine is initialized and we can check the difference in parameters.

drazvan avatar May 20 '24 21:05 drazvan

Hey @drazvan, chat models are supported via the AzureChatOpenAI() class from LangChain.

I tried the code below and it worked for me:

from langchain_openai import AzureChatOpenAI
from nemoguardrails import LLMRails, RailsConfig

llm = AzureChatOpenAI(
    model="gpt-4o",
    api_key="",
    azure_endpoint="",
    api_version="*******",
)
print(llm.invoke("Tell me about Lung cancer?"))

config = RailsConfig.from_path("./config")
rails = LLMRails(config, llm=llm, verbose=True)

try:
    res = rails.generate(
        messages=[
            {
                "role": "user",
                "content": "Tell me about NSCLC?",
            }
        ],
    )
except Exception as e:
    res = None

ramchennuru avatar Oct 15 '24 16:10 ramchennuru

@drazvan Good to close this issue.

ramchennuru avatar Oct 15 '24 16:10 ramchennuru

Thank you @ramchennuru for confirmation 👍🏻

Pouyanpi avatar Oct 16 '24 06:10 Pouyanpi