
[BUG] Response Format JSON for Azure OpenAI Models

Open patricktu2 opened this issue 9 months ago • 2 comments

Description

I'm trying to set up a crew / agents based on Azure OpenAI models. I'm getting parsing errors and therefore want to use JSON mode.

Setting up the LLM with a response format leads to an error:

llm = LLM(
    model=f"azure/{SECRETS['AZURE_OPENAI_DEPLOYMENT_NAMES'][0]}",
    api_key=SECRETS['AZURE_OPENAI_API_KEY'],
    api_base=SECRETS['AZURE_OPENAI_ENDPOINT'],
    api_version="2025-01-01-preview",
    response_format={"type": "json_object"}
)

regulatory_researcher_agent = Agent(
    config=agents_config['regulatory_researcher_agent'],
    llm=llm
)
...

Error:

 Error during LLM call: The model azure/gpt-4o does not support response_format for provider 'azure'. Please remove response_format or use a supported model.
 An unknown error occurred. Please check the details below.
 Error details: The model azure/gpt-4o does not support response_format for provider 'azure'. Please remove response_format or use a supported model.

I am using gpt-4o-2024-11-20, which according to the documentation should support JSON mode. Why is this not working? Could you enable it for the Azure OpenAI models that support it? Or how else can I get structured JSON output from my agents?
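Until JSON mode works for Azure deployments, one stdlib-only workaround for the parsing errors is to extract the JSON object from the agent's plain-text reply. This is a best-effort sketch, not part of crewAI; the `extract_json` helper is hypothetical and handles the common cases of replies wrapped in markdown fences or surrounded by extra prose:

```python
import json
import re


def extract_json(raw: str) -> dict:
    """Best-effort extraction of a JSON object from an LLM reply.

    Handles replies wrapped in ```json fences or surrounded by prose,
    which is a common source of parsing errors when JSON mode is
    unavailable.
    """
    # Prefer a fenced ```json ... ``` block if one is present.
    fenced = re.search(r"```(?:json)?\s*(\{.*?\})\s*```", raw, re.DOTALL)
    if fenced:
        candidate = fenced.group(1)
    else:
        # Fall back to the outermost braces in the text.
        start, end = raw.find("{"), raw.rfind("}")
        if start == -1 or end == -1:
            raise ValueError("no JSON object found in reply")
        candidate = raw[start:end + 1]
    return json.loads(candidate)
```

You would then call `extract_json(result.raw)` on the crew's output instead of relying on `response_format`.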


Steps to Reproduce

  1. Deploy an Azure OpenAI model that supports JSON mode, e.g. gpt-4o-2024-11-20
  2. Set up the LLM:
llm = LLM(
    model=f"azure/{SECRETS['AZURE_OPENAI_DEPLOYMENT_NAMES'][0]}",
    api_key=SECRETS['AZURE_OPENAI_API_KEY'],
    api_base=SECRETS['AZURE_OPENAI_ENDPOINT'],
    api_version="2025-01-01-preview",
    response_format={"type": "json_object"}
)
# random crew
...

Expected behavior

Agents returning parsed json objects

Screenshots/Code snippets


Operating System

macOS Sonoma

Python Version

3.11

crewAI Version

0.102.0

crewAI Tools Version

Virtual Environment

Poetry

Evidence

Possible Solution

Add response_format support for the specific Azure OpenAI models that provide JSON mode.

Additional context

patricktu2 avatar Feb 28 '25 10:02 patricktu2

Could you update the implementation to drop the blanket restriction for all Azure models? That would let models that do support response_format use it without being blocked. Thanks!

romellfudi avatar Mar 14 '25 23:03 romellfudi

@patricktu2 have you tried what @romellfudi said?

lucasgomide avatar Apr 11 '25 21:04 lucasgomide

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar May 12 '25 12:05 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar May 18 '25 12:05 github-actions[bot]