
Skip SSL verification option missing while creating model client in v0.4.x

Open avinashmihani opened this issue 10 months ago • 4 comments

What happened?

Describe the bug I was using v0.2.x earlier for an agentic implementation. The agent interacting with the LLM expected an LLM_CONFIG, which had an option to skip SSL verification. In v0.4.x, the model client initialization does not have an option for skipping verification.

To Reproduce In v0.2, the client was configured as below:

import httpx

class MyHttpClient(httpx.Client):
    # Return self on deepcopy so the client is not copied when the config is duplicated.
    def __deepcopy__(self, memo):
        return self

llm_config = {
    "config_list": [
        {
            "model": "some-model",
            "api_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
            "base_url": "https://genai-api.example.com/v1",
            # Custom HTTP client with SSL verification disabled.
            "http_client": MyHttpClient(verify=False),
        }
    ],
}
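
For context, in v0.2 this llm_config was passed directly to an agent, which is how the custom http_client reached the underlying OpenAI client. A minimal sketch (the agent name is illustrative):

import autogen

# The v0.2 agent accepts the llm_config dict as-is, including the custom
# http_client entry with SSL verification disabled.
assistant = autogen.AssistantAgent(name="assistant", llm_config=llm_config)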

In v0.4, the model client is initialized as below.

from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
)

Expected behavior It is expected that v0.4 provides an option within the OpenAIChatCompletionClient module, like llm_config in v0.2, to handle http_client.

Sample below:

import httpx
from autogen_ext.models.openai import OpenAIChatCompletionClient

class MyHttpClient(httpx.Client):
    def __deepcopy__(self, memo):
        return self

model_client = OpenAIChatCompletionClient(
    model=model,
    api_key=api_key,
    base_url=base_url,
    model_capabilities={"vision": True, "function_calling": True, "json_output": True},
    http_client=MyHttpClient(verify=False),  # desired: pass a custom HTTP client
)

Screenshots Below is the error:

httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1006)

Additional context We need to bypass SSL verification for a custom model integration with our agent implementation when running from a local machine.
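
As a side note, a less drastic workaround for local development is to point httpx at the organization's CA bundle rather than disabling verification entirely; a sketch, assuming such a bundle exists (the path below is illustrative):

import httpx

# Trust the internal CA instead of turning verification off.
# The bundle path is an example and would come from your environment.
http_client = httpx.Client(verify="/etc/ssl/certs/internal-ca.pem")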

Which packages was the bug in?

Python AgentChat (autogen-agentchat>=0.4.0)

AutoGen library version.

Python 0.4.7

Other library version.

No response

Model used

No response

Model provider

None

Other model provider

No response

Python version

None

.NET version

None

Operating system

None

avinashmihani · Mar 03 '25 15:03

Thanks for the issue. I think we can fix this by adding http_client to the list of BaseOpenAIClientConfiguration:

https://github.com/microsoft/autogen/blob/main/python/packages/autogen-ext/src/autogen_ext/models/openai/config/__init__.py#L35-L45
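
A rough sketch of what that change might look like; the surrounding fields are assumptions based on the linked file rather than the actual autogen-ext source, and the new entry would be forwarded to the underlying openai.AsyncOpenAI client, which already accepts an http_client argument:

import httpx
from typing import TypedDict

# Hypothetical shape of the configuration TypedDict; every field except
# http_client is illustrative.
class BaseOpenAIClientConfiguration(TypedDict, total=False):
    model: str
    api_key: str
    base_url: str
    http_client: httpx.AsyncClient  # new field: custom client, e.g. with verify=False

# The client constructor would then pass it through, roughly:
# openai.AsyncOpenAI(api_key=..., base_url=..., http_client=config.get("http_client"))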

@avinashmihani can you submit a PR to fix this?

ekzhu · Mar 03 '25 23:03

@ekzhu I have added the PR for the change you mentioned.

avinashmihani · Mar 04 '25 05:03

hi bro, when will this issue be closed? @ekzhu We are looking forward to this new feature, thanks :)

zijianding · Apr 09 '25 00:04

> hi bro, when will this issue be closed? @ekzhu We are looking forward to this new feature, thanks :)

If you are looking forward to using it, can you submit a PR? The original PR is blocked due to a missing CLA.

ekzhu · Apr 09 '25 00:04