
How to configure Colab to use the Azure OpenAI API

quartermaine opened this issue 10 months ago · 5 comments

I am trying to run the Colab notebook from the documentation. How do I set the environment variables to use the Azure OpenAI API? I can't find the

~/.metagpt/config2.yaml

file to update with the Azure OpenAI settings.

quartermaine · Feb 20 '25 08:02

See #1690 for reference.

seehi · Feb 20 '25 08:02

Hello @seehi,

Thanks for the quick response. I used the following to set the environment variables:

from google.colab import userdata  # Colab secrets
import metagpt.const

metagpt.const.API_KEY = userdata.get('OPENAI_API_KEY')
metagpt.const.MODEL = userdata.get('MODEL_NAME')
metagpt.const.API_TYPE = userdata.get('OPENAI_API_TYPE')
metagpt.const.API_VERSION = userdata.get('OPENAI_API_VERSION')
metagpt.const.BASE_URL = userdata.get('AZURE_OPENAI_ENDPOINT')

But when I run the cell

import asyncio

from metagpt.roles import (
    Architect,
    Engineer,
    ProductManager,
    ProjectManager,
)
from metagpt.team import Team

I get the following error:

RecursionError                            Traceback (most recent call last)
<ipython-input-5-46a1bd52cacd> in <cell line: 0>()
      1 import asyncio
      2 
----> 3 from metagpt.roles import (
      4     Architect,
      5     Engineer,

11 frames
... last 2 frames repeated, from the frame below ...

/usr/local/lib/python3.11/dist-packages/metagpt/config.py in get_default_llm_provider_enum(self)
    112         if provider is LLMProviderEnum.GEMINI and not require_python_version(req_version=(3, 10)):
    113             warnings.warn("Use Gemini requires Python >= 3.10")
--> 114         model_name = self.get_model_name(provider=provider)
    115         if model_name:
    116             logger.info(f"{provider} Model: {model_name}")

RecursionError: maximum recursion depth exceeded while calling a Python object

quartermaine · Feb 20 '25 08:02

For example:

from pathlib import Path

import metagpt.const

# 1. Change CONFIG_ROOT
metagpt.const.CONFIG_ROOT = Path.cwd()

# 2. Create config2.yaml
content = '''llm:
  api_type: "openai"  # or azure / ollama / open_llm etc. Check LLMType for more options
  model: "gpt-4-turbo-preview"  # or gpt-3.5-turbo-1106 / gpt-4-1106-preview
  base_url: "https://api.openai.com/v1"  # or forward url / other llm url
  api_key: "xxx"'''

with open('config2.yaml', 'w') as f:
    f.write(content)
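
For the Azure OpenAI setup asked about above, the same pattern should work with an Azure-style llm section. The snippet below is a minimal sketch rather than a verified configuration: it assumes the Colab secret names used earlier in this thread (OPENAI_API_KEY, MODEL_NAME, OPENAI_API_VERSION, AZURE_OPENAI_ENDPOINT) and the azure fields (api_type, base_url, api_key, api_version, model) shown in MetaGPT's config2.yaml.example; adjust the names to match your deployment.

from pathlib import Path

from google.colab import userdata  # Colab secrets manager; secret names assumed from this thread
import metagpt.const

# Point MetaGPT at the current working directory so it picks up the config written below
metagpt.const.CONFIG_ROOT = Path.cwd()

# Azure-flavored config2.yaml: api_type "azure" plus base_url / api_key / api_version / model
content = f'''llm:
  api_type: "azure"
  model: "{userdata.get('MODEL_NAME')}"  # Azure deployment name
  base_url: "{userdata.get('AZURE_OPENAI_ENDPOINT')}"
  api_key: "{userdata.get('OPENAI_API_KEY')}"
  api_version: "{userdata.get('OPENAI_API_VERSION')}"
'''

with open('config2.yaml', 'w') as f:
    f.write(content)

As in the example above, set CONFIG_ROOT and write the file before importing metagpt.roles, so the configuration is already in place when MetaGPT first loads it.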

seehi · Feb 20 '25 09:02

This issue has no activity in the past 30 days. Please comment on the issue if you have anything to add.

github-actions[bot] · Mar 23 '25 00:03

I will check the proposed solution and let you know.

Thanks

quartermaine · Mar 28 '25 07:03

This issue has no activity in the past 30 days. Please comment on the issue if you have anything to add.

github-actions[bot] · May 22 '25 00:05