
[Bug]: GPT-5 stopped giving json output

Open arushkumarsingh opened this issue 1 month ago • 6 comments

What happened?

```python
response = litellm.completion(
    model='gpt-5',
    messages=messages,
    response_format=AnalysisModel,
    temperature=1,
)
```

Relevant log output

Error running structured completion: litellm.JSONSchemaValidationError: model=, returned an invalid response=1

```
[
  {
    "id": "rs_0824027ffb43f3e000691d7e5cc7e081a0bc62f205311437e8",
    "summary": [],
    "type": "reasoning",
    "content": null,
    "encrypted_content": null,
    "status": null
  },
  {
    "id": "msg_0824027ffb43f3e000691d7e8ce0a081a083c237da9e14c8d0",
    "content": [
      {
        "annotations": [],
        "text": "1. Summary
          - Main theme: Reframe anxiety as a usable power and channel it into productivity through three practical techniques.
          - Tone: friendly, energetic, professional
          - Content type: tutorial
          - Key topics:
            - Don’t fight anxiety; reframe it as information/power
            - Solution 1: Externalize thoughts—write them down and reality-check (“kya yeh 100% sach hai?”)
            - Solution 2: Body reset—fast walk, cold water sips, 10 deep breaths
            - Solution 3: Schedule worry time (e.g., 6:30 pm) and redirect
          ... (1453 more characters truncated)",
        "type": "output_text",
        "logprobs": []
      }
    ],
    "role": "assistant",
    "status": "completed",
    "type": "message"
  }
]
```
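The validation failure above appears to arise because the model returned plain prose ("1. Summary ...") rather than a JSON document matching the schema, so parsing fails before schema validation can even run. A minimal illustration (the `model_output` string is an abbreviation of the `output_text` in the log):

```python
import json

# Abbreviated version of the `output_text` from the log above: prose, not JSON.
model_output = "1. Summary\n- Main theme: Reframe anxiety as a usable power..."

try:
    parsed = json.loads(model_output)
except json.JSONDecodeError as exc:
    # Prose output cannot be parsed as JSON, so schema validation never succeeds.
    print(f"not valid JSON: {exc}")
```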

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.80.0

Twitter / LinkedIn details

No response

arushkumarsingh avatar Nov 19 '25 09:11 arushkumarsingh

same here litellm v1.74.15.post2

orange-fritters avatar Nov 19 '25 09:11 orange-fritters

OpenAI changed the way reasoning effort is handled, and the current `reasoning_effort` parameter in `completion` no longer works for OpenAI. One way to handle it is to pass the new reasoning structure in the arguments.
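For illustration only, here is a hedged sketch of the difference between the flat parameter and a nested reasoning object. The `reasoning={"effort": ...}` shape is an assumption based on OpenAI's Responses API; check the LiteLLM docs for the exact parameter name and spelling:

```python
# Hypothetical sketch: flat parameter vs. nested reasoning object.
# Both dicts are illustrative kwargs, not verified LiteLLM signatures.
old_style_kwargs = {
    "model": "gpt-5",
    "reasoning_effort": "medium",  # flat parameter, reportedly no longer honored
}

new_style_kwargs = {
    "model": "gpt-5",
    "reasoning": {"effort": "medium"},  # nested shape, as in the Responses API
}

print(new_style_kwargs["reasoning"]["effort"])
```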

AlirezaNadafN avatar Nov 19 '25 10:11 AlirezaNadafN

What is the new reasoning structure? How do I add it in the args?

arushkumarsingh avatar Nov 19 '25 10:11 arushkumarsingh

Calling OpenAI directly gives a bad request:

`Error code: 400 - {'error': {'message': "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.", 'type': 'invalid_request_error', 'param': 'messages', 'code': None}}`

For me, updating the prompt message to include a hint that the output should be JSON fixed the issue.
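The workaround above can be sketched as a small pre-flight check on the messages. `ensure_json_hint` is a hypothetical helper, not part of LiteLLM:

```python
# Make sure the word "json" appears somewhere in the messages before
# requesting a JSON response format, as the 400 error above demands.
def ensure_json_hint(messages):
    if not any("json" in str(m.get("content", "")).lower() for m in messages):
        messages = [{"role": "system",
                     "content": "Respond only with a valid JSON object."}] + messages
    return messages

messages = [{"role": "user", "content": "Summarize this transcript."}]
messages = ensure_json_hint(messages)
print(messages[0]["content"])
```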

aleksas avatar Nov 19 '25 12:11 aleksas

For people who stumble across this, it has the same root cause as this issue: https://github.com/BerriAI/litellm/issues/16810. The root cause is a cloud-hosted configuration file that was updated, which is why this breaks across multiple versions of the LiteLLM library.

The current workaround is setting the environment variable `LITELLM_LOCAL_MODEL_COST_MAP="True"`.
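As a sketch, the workaround can be applied in the shell before starting the process (variable name taken from the comment above):

```shell
# Use LiteLLM's bundled local model-cost map instead of fetching the
# remote configuration file (workaround from the linked issue).
export LITELLM_LOCAL_MODEL_COST_MAP="True"
echo "$LITELLM_LOCAL_MODEL_COST_MAP"
```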

jverre avatar Nov 19 '25 17:11 jverre

Also reviewed this; it's quite a problematic issue. Someone is attempting a fix at https://github.com/BerriAI/litellm/pull/16844

vincentkoc avatar Nov 21 '25 06:11 vincentkoc