
Structured Responses

Open johnisanerd opened this issue 11 months ago • 7 comments

I would like to suggest supporting structured responses, in particular JSON responses. Are there any plans for this?

johnisanerd avatar Nov 26 '24 06:11 johnisanerd

Hey @johnisanerd, can you assign this feature to me?

Thank you.

ibrahim-string avatar Nov 26 '24 06:11 ibrahim-string

I also need a similar function. Thank you very much!

client = OpenAI(api_key=openai_api_key)
client.beta.chat.completions.parse(...)
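For reference, a minimal sketch of what that beta call looks like end to end (the Event model and prompts here are made up purely for illustration):

from pydantic import BaseModel
from openai import OpenAI

class Event(BaseModel):
    name: str
    date: str

client = OpenAI(api_key=openai_api_key)
completion = client.beta.chat.completions.parse(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Science fair on Friday, Nov 1st."},
    ],
    response_format=Event,  # a pydantic model here, not a {"type": ...} dict
)
event = completion.choices[0].message.parsed  # an Event instance (or None if the model refused)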

GarfieldHuang avatar Nov 27 '24 00:11 GarfieldHuang

Thanks for the request. Did you mean https://platform.openai.com/docs/guides/structured-outputs ? "Structured outputs" lives under client.beta in the OpenAI SDK, so we are not supporting it currently.

If you simply want JSON responses, you could use it like this:

(screenshot: aisuite - JSON response example)
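In other words, something along these lines (a sketch; the prompt and model are placeholders, and response_format is simply passed through to the underlying provider, as in the snippet later in this thread):

import aisuite as ai

client = ai.Client()  # assumes the OpenAI API key is set in the environment
response = client.chat.completions.create(
    model="openai:gpt-4o-mini",
    messages=[
        {"role": "system", "content": "Reply only with a JSON object."},
        {"role": "user", "content": "Give me the three primary colors as JSON."},
    ],
    response_format={"type": "json_object"},  # forwarded to the OpenAI API
)
print(response.choices[0].message.content)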

rohitprasad15 avatar Nov 27 '24 05:11 rohitprasad15

To add to @rohitprasad15's answer: you can also use a pydantic model to structure your output. Since structured outputs proper lives under OpenAI's client.beta, you just have to spell the schema out in the system message:

import aisuite as ai

client = ai.Client()  # assumes provider API keys are set in the environment
models2 = ["openai:gpt-4o-mini-2024-07-18", "openai:gpt-4o-2024-08-06"]

messages = [
    {"role": "system", "content": """
     Extract the event information using the `Event` pydantic model below and return it as JSON:

     class Event(BaseModel):
         '''A pydantic model to extract the event information.'''

         attendees: Optional[List[str]] = Field(default=None, description="a list of attendees")
         event_name: Optional[str] = Field(default=None, description="the name of the event")
         event_date: Optional[str] = Field(default=None, description="the date of the event")
     """},
    {"role": "user", "content": "Alice and Bob are going to a science fair on Friday, Nov 1st 2024."},
]

responses = []
for model in models2:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.0,
        response_format={"type": "json_object"},
    )
    responses.append(response)

The output type is str, as shown below, but it can now easily be parsed as JSON (or validated against the pydantic model, see the sketch after the outputs):

model: openai:gpt-4o-mini-2024-07-18
response: {
  "attendees": ["Alice", "Bob"],
  "event_name": "science fair",
  "event_date": "2024-11-01"
}, type: <class 'str'>

model: openai:gpt-4o-2024-08-06
response: {
  "attendees": ["Alice", "Bob"],
  "event_name": "science fair",
  "event_date": "Friday, November 1st, 2024"
}, type: <class 'str'>
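For completeness, a minimal sketch of validating those strings back into the Event model (assuming pydantic v2 and the same Event definition as in the system prompt):

from typing import List, Optional
from pydantic import BaseModel, Field

class Event(BaseModel):
    """A pydantic model to extract the event information."""
    attendees: Optional[List[str]] = Field(default=None, description="a list of attendees")
    event_name: Optional[str] = Field(default=None, description="the name of the event")
    event_date: Optional[str] = Field(default=None, description="the date of the event")

for model, response in zip(models2, responses):
    content = response.choices[0].message.content  # the JSON string shown above
    event = Event.model_validate_json(content)     # raises ValidationError if the model drifted
    print(model, event)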

jonathanbouchet avatar Nov 27 '24 19:11 jonathanbouchet

Exactly: the two things you probably want to add to a prompt to increase the likelihood of compliance with a structure are the schema and examples.

boxabirds avatar Nov 27 '24 23:11 boxabirds

Support for JSON response enforcement is very important, in my opinion.

luisrock avatar Dec 03 '24 14:12 luisrock

I totally agree. It seems I am forced to work around this by using tools, but I just want to enforce JSON format in the response. Almost all LLM providers offer a parameter for this; Anthropic, OpenAI, and Gemini do for sure. But they each do it in a different way, so isn't this exactly where the magic of an integration layer like aisuite would shine?
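For illustration, here is roughly how the native SDKs diverge (a sketch with placeholder prompts and model names; parameter names as of the SDK versions current when this thread was active):

# OpenAI: a response_format parameter on chat.completions.create
from openai import OpenAI
openai_client = OpenAI()
openai_resp = openai_client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Return a JSON object describing a cat."}],
    response_format={"type": "json_object"},
)

# Gemini: a response_mime_type field on the generation config
import google.generativeai as genai
gemini_model = genai.GenerativeModel("gemini-1.5-flash")
gemini_resp = gemini_model.generate_content(
    "Return a JSON object describing a cat.",
    generation_config={"response_mime_type": "application/json"},
)

# Anthropic has no response_format flag; the common workaround is to force a tool
# call whose input schema is the JSON you want, then read the tool input back out.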

ghost avatar Mar 19 '25 16:03 ghost