
Azure OpenAI

jamie93201 opened this issue 1 year ago · 6 comments

Search before asking

  • [X] I had searched in the issues and found no similar feature requirement.

Description

How to use Azure OpenAI? Thanks!

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

jamie93201 avatar Dec 30 '24 08:12 jamie93201


Duplicate issue, please refer to "Are there any models other than deepseek-chat supported?"

caszkgui avatar Dec 30 '24 09:12 caszkgui

    <class 'RuntimeError'>: invalid llm config: {'api_key': 'ce3de18xxxxxxxxxxx', 'base_url': 'https://xxxxxxxx.openai.azure.com', 'model': 'gpt-4o', 'client_type': 'maas'},
    for details: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

The config for Azure OpenAI is different.

jamie93201 avatar Dec 31 '24 06:12 jamie93201


You can test your model service accessibility through the model Service Availability Test:

[screenshot: model Service Availability Test]

But the chat/completions postfix should be removed when you enter the URL in the LLM configuration.
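
For an Azure OpenAI resource specifically, a quick reachability check outside the UI is to call the Azure REST endpoint directly. The following is only a sketch with placeholder resource name, deployment name, and key; it also shows why the plain base_url/chat/completions path returns the 404 above, since Azure routes requests per deployment and requires an api-version query parameter.

    # Reachability check for an Azure OpenAI deployment (all values are placeholders).
    # Azure expects /openai/deployments/<deployment>/chat/completions?api-version=...,
    # not the bare /chat/completions path an OpenAI-style client appends.
    import requests

    resource = "https://YOUR-RESOURCE.openai.azure.com"   # placeholder resource root
    deployment = "gpt-4o"                                  # placeholder deployment name

    resp = requests.post(
        f"{resource}/openai/deployments/{deployment}/chat/completions",
        params={"api-version": "2024-06-01"},
        headers={"api-key": "YOUR-API-KEY"},               # placeholder key
        json={"messages": [{"role": "user", "content": "ping"}]},
        timeout=30,
    )
    print(resp.status_code, resp.text)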

caszkgui avatar Dec 31 '24 07:12 caszkgui


I have the same problem. In order to use Azure OpenAI, it needs to support additional parameters such as AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_DEPLOYNAME. For more detail see https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
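
To make that concrete, here is a minimal sketch (placeholder values) of how the official openai Python package consumes those Azure-specific settings via environment variables; with Azure, the model argument carries the deployment name rather than the base model id.

    # Sketch only: the extra Azure-specific settings supplied via environment
    # variables that the official openai package reads automatically.
    import os
    from openai import AzureOpenAI

    os.environ["AZURE_OPENAI_ENDPOINT"] = "https://YOUR-RESOURCE.openai.azure.com"  # placeholder
    os.environ["AZURE_OPENAI_API_KEY"] = "YOUR-API-KEY"                             # placeholder
    os.environ["OPENAI_API_VERSION"] = "2024-06-01"

    client = AzureOpenAI()  # picks up endpoint, key, and API version from the env
    resp = client.chat.completions.create(
        model="YOUR-DEPLOYMENT-NAME",   # the deployment name, not the base model id
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)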

Magicen0722 avatar Jan 02 '25 10:01 Magicen0722

Does this mean that we need to write the connection logic for this model separately?

Magicen0722 avatar Jan 02 '25 10:01 Magicen0722


It looks like Azure OpenAI is not compatible with the OpenAI interface, and KAG only supports model services that are compatible with the OpenAI API.

You can customize the LLMClient implementation of KAG in developer mode, or choose another OpenAI-compatible model service.
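
If you take the developer-mode route, a thin wrapper along these lines could be a starting point. This is an illustration only and does not use KAG's actual LLMClient base class or method names, which you would need to match.

    # Illustrative wrapper only; adapting it for KAG in developer mode requires
    # conforming to KAG's actual LLMClient interface.
    from openai import AzureOpenAI

    class AzureChatClient:
        def __init__(self, api_key, azure_endpoint, deployment, api_version="2024-06-01"):
            self.deployment = deployment
            self.client = AzureOpenAI(
                api_key=api_key,
                azure_endpoint=azure_endpoint,
                api_version=api_version,
            )

        def __call__(self, prompt: str) -> str:
            # On Azure, the deployment name goes where OpenAI expects the model id.
            resp = self.client.chat.completions.create(
                model=self.deployment,
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content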

caszkgui avatar Jan 03 '25 09:01 caszkgui

Hello,

I tried to set up the example_config.yaml to use my Azure OpenAI API credentials but seem to be running into an issue:

    (kag-demo) (base) user examples % knext project create --config_path ./example_config.yaml
    Done initialize project config with host addr http://127.0.0.1:8887 and project_id 1
    No config found.
    Error: call AzureOpenAIClient.__init__ failed, details: AzureOpenAI.__init__() got an unexpected keyword argument 'model'

Here's my edited config file:

    #------------project configuration start----------------#
    openie_llm: &openie_llm
      api_key: "..."
      base_url: "https://....openai.azure.com/"
      model: "gpt-4o-mini"
      api_version: "2024-06-01"
      type: azure_openai

    chat_llm: &chat_llm
      api_key: "..."
      base_url: "https://....openai.azure.com/"
      model: "gpt-4o-mini"
      api_version: "2024-06-01"
      type: azure_openai

    vectorize_model: &vectorize_model
      api_key: "..."
      base_url: "https://.....openai.azure.com/"
      azure_deployment: "text-embedding-3-large"
      type: azure_openai
      vector_dimensions: 1536
    vectorizer: *vectorize_model

    log:
      level: INFO

    project:
      biz_scene: default
      host_addr: http://127.0.0.1:8887
      id: "1"
      language: en
      namespace: TwoWikiTest
    #------------project configuration end----------------#

josephmpariser avatar Feb 06 '25 18:02 josephmpariser

I was able to successfully connect and get a response from the Azure OpenAI API. I did this by tinkering with the openai_client.py file and changing self.client to:

    self.client = AzureOpenAI(
        api_key=api_key,
        azure_endpoint=base_url,
        azure_deployment=azure_deployment,
        api_version=api_version,
    )
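
For anyone adapting this later, here is a minimal sketch of the same idea that keeps the standard client working as well; the parameter names are assumptions, not KAG's actual configuration keys.

    # Sketch (assumed parameter names): pick the Azure client when a deployment is
    # configured, otherwise fall back to the standard OpenAI-compatible client.
    from openai import AzureOpenAI, OpenAI

    def build_client(api_key, base_url, api_version=None, azure_deployment=None):
        if azure_deployment:
            # Azure: base_url is the resource root; requests target a deployment.
            return AzureOpenAI(
                api_key=api_key,
                azure_endpoint=base_url,
                azure_deployment=azure_deployment,
                api_version=api_version,
            )
        return OpenAI(api_key=api_key, base_url=base_url)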

josephmpariser avatar Feb 07 '25 00:02 josephmpariser


This feature is now supported; see #269.

caszkgui avatar Apr 18 '25 08:04 caszkgui