
[Bug]: `AgentOptimizer` error when using `LLMConfig`

Open · giorgossideris opened this issue 7 months ago · 4 comments

Describe the bug

When an `LLMConfig` object is passed as an argument to the `AgentOptimizer` constructor, it is not accessed correctly, because the constructor handles `llm_config` as a plain dict:

```python
if self.llm_config in [{}, {"config_list": []}, {"config_list": [{"model": ""}]}]:
    raise ValueError(
        "When using OpenAI or Azure OpenAI endpoints, specify a non-empty 'model' either in 'llm_config' or in each config of 'config_list'."
    )
self.llm_config["config_list"] = filter_config(llm_config["config_list"], {"model": [self.optimizer_model]})
self._client = OpenAIWrapper(**self.llm_config)
```
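A standalone sketch of the dict-vs-object mismatch (using a hypothetical `FakeLLMConfig` stand-in, not autogen's real `LLMConfig`, which may implement partial dict behavior): dict-style checks do not fail loudly on a config object, they simply never match, so the intended validation is bypassed.

```python
# Hypothetical stand-in for an object-style llm_config (illustration only).
class FakeLLMConfig:
    def __init__(self, config_list):
        self.config_list = config_list


cfg = FakeLLMConfig(config_list=[{"model": ""}])

# The emptiness guard compares against dict literals; an arbitrary object is
# never equal to any of them, so the ValueError branch is silently skipped:
print(cfg in [{}, {"config_list": []}, {"config_list": [{"model": ""}]}])  # False

# Dict-style unpacking (as in OpenAIWrapper(**self.llm_config)) requires a
# mapping, which a plain object is not:
try:
    dict(**cfg)
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```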

This issue is related to #1774. To address both, I think `ConversableAgent`'s `_validate_llm_config` method should become a classmethod of `LLMConfig`, so that it can be reused in cases outside of agents.
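A rough sketch of what that refactor could look like (names, signatures, and the stand-in class are illustrative assumptions, not autogen's actual API): a classmethod on `LLMConfig` that normalizes either an `LLMConfig` instance or a plain dict, so non-agent callers such as `AgentOptimizer` can share the validation.

```python
from typing import Optional, Union


class LLMConfig:
    """Illustrative stand-in; autogen's real LLMConfig differs."""

    def __init__(self, config_list):
        self.config_list = config_list

    @classmethod
    def _validate_llm_config(cls, llm_config: Optional[Union["LLMConfig", dict]]) -> dict:
        """Normalize to a dict and validate, mirroring ConversableAgent's check."""
        if isinstance(llm_config, cls):
            llm_config = {"config_list": llm_config.config_list}
        if llm_config in [None, {}, {"config_list": []}, {"config_list": [{"model": ""}]}]:
            raise ValueError(
                "When using OpenAI or Azure OpenAI endpoints, specify a non-empty 'model'."
            )
        return llm_config


# AgentOptimizer's constructor could then call the shared validator:
normalized = LLMConfig._validate_llm_config(LLMConfig(config_list=[{"model": "gpt-4o"}]))
print(normalized)  # {'config_list': [{'model': 'gpt-4o'}]}
```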

Steps to reproduce

```python
from autogen import LLMConfig
from autogen.agentchat.contrib.agent_optimizer import AgentOptimizer

llm_config = LLMConfig.from_json(path="OAI_CONFIG_LIST")
optimizer = AgentOptimizer(llm_config=llm_config, max_actions_per_step=3, optimizer_model="gpt-4o")
```

Raises:

```
OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```

giorgossideris · May 05 '25 12:05

Hi @giorgossideris: Thanks for flagging this issue and sharing your suggestion.

@marklysze and @davorrunje, would appreciate your thoughts on the proposed approach. @giorgossideris, if the feedback looks good, would you be open to submitting a PR with the fix?

harishmohanraj · May 05 '25 12:05

cc @kumaranvpl

harishmohanraj · May 06 '25 00:05

@skzhang1 Please take a look!

qingyun-wu · May 06 '25 01:05

Related to #1946

Lancetnik · Aug 06 '25 20:08