
azure responses gpt-5-codex 400 The encrypted content for item rs_68dxxx could not be verified

Open abazabaaa opened this issue 3 months ago • 21 comments

What version of Codex is running?

0.41.0

Which model were you using?

gpt-5-codex

What platform is your computer?

Linux 4.18.0-553.el8_10.x86_64 x86_64 x86_64

What steps can reproduce the bug?

╭───────────────────────────────────────────╮
│ >_ OpenAI Codex (v0.41.0)                 │
│                                           │
│ model: gpt-5-codex       /model to change │
│ directory: /ap/bin                        │
╰───────────────────────────────────────────╯

To get started, describe a task or try one of these commands:

/init - create an AGENTS.md file with instructions for Codex
/status - show current session configuration
/approvals - choose what Codex can do without approval
/model - choose what model and reasoning effort to use

▌ please investigate the scripts dir

• I’m thinking about whether I need a plan for investigating the scripts directory. Maybe it's a good idea, but it might be simple enough to bypass that. I could just list the contents and report on them. So, first, I’ll check the directory and its size, ensuring I’m in the correct working directory—maybe /ap/xxxxx? I’ll just use shell commands like ls to gather the information I need.

⚠️ stream error: unexpected status 400 Bad Request: { "error": { "message": "The encrypted content for item rs_68d43ab0a6c0xxxxc8xxx could not be verified.", "type": "invalid_request_error", }; retrying 1/5 in 219ms…

⚠️ stream error: unexpected status 400 Bad Request: { "error": { "message": "The encrypted content for item rs_68d43ab0a6c08194axxxx could not be verified.", "type": "invalid_request_error", "param": null, "code": null } }; retrying 2/5 in 427ms…

• I need to check the contents of the scripts directory. Maybe I should come up with a plan: first, I'll list the files, then identify any interesting ones. I'm considering different methods, like using tree for a visual structure or find for searching specific files. I could also simply use ls for a straightforward list. There’s a lot to consider! Let's figure out the best approach.

• Explored └ List scripts

and sizes, and a tree command for an overview, though I should proceed cautiously.

• Explored └ List ls -la

⚠️ stream error: unexpected status 400 Bad Request: { "error": { "message": "The encrypted content for item rs_68d43ab0axxx could not be verified.", "type": "invalid_request_error", "param": null, "code": null } }; retrying 1/5 in 192ms…

⚠️ stream error: unexpected status 400 Bad Request: { "error": { "message": "The encrypted content for item rs_68d43ab0xxxx could not be verified.", "type": "invalid_request_error", }; retrying 2/5 in 368ms…

• Explored └ List ls -la List ls -la

What is the expected behavior?

model_provider = "azure"
sandbox_mode = "danger-full-access"

[model_providers.azure]
name = "Azure"
base_url = "https://xxx-aigateway-xxx-nonprod.azu.xxx.com/openai"
query_params = { api-version = "2025-04-01-preview" }
wire_api = "responses"
env_http_headers = { "api-key" = "AZURE_OPENAI_API_KEY" }

What do you see instead?

No response

Additional information

This seems to break Codex with Azure at present. The most recent build on main also has this issue. It will occasionally allow the model to progress through a turn, but much of the time it gets stuck and repeats the same commands over and over.

abazabaaa avatar Sep 24 '25 18:09 abazabaaa

I think the bug is caused by the fact that our endpoint doesn't have one of these strings in it:

fn matches_azure_responses_base_url(base_url: &str) -> bool {
    let base = base_url.to_ascii_lowercase();
    const AZURE_MARKERS: [&str; 5] = [
        "openai.azure.",
        "cognitiveservices.azure.",
        "aoai.azure.",
        "azure-api.",
        "azurefd.",
    ];
    AZURE_MARKERS.iter().any(|marker| base.contains(marker))
}
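As a quick sanity check, here's a minimal sketch (just the function above paired with a hypothetical main; the gateway hostname is only the redacted shape of the one in my config, not a real endpoint) showing that a gateway-style URL matches none of the markers while a standard *.openai.azure.com endpoint does:

fn main() {
    // Gateway-style hostname shaped like the one in the config above (redacted):
    // it contains none of the AZURE_MARKERS substrings, so the check returns false.
    let gateway = "https://xxx-aigateway-xxx-nonprod.azu.xxx.com/openai";
    assert!(!matches_azure_responses_base_url(gateway));

    // A standard Azure OpenAI endpoint contains "openai.azure." and passes.
    let azure = "https://my-resource.openai.azure.com/openai";
    assert!(matches_azure_responses_base_url(azure));

    println!("marker check behaves as described above");
}

So, presumably, endpoints behind a custom gateway are not detected as Azure even when they speak the Azure Responses API, which would line up with the error only showing up on this kind of base_url.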

abazabaaa avatar Sep 24 '25 20:09 abazabaaa

same issue - macos codex-cli 0.41.0

model = any via API (after logging out from my Pro account and selecting API because the 5-day credits ran out)

⚠️ stream error: unexpected status 400 Bad Request: {
  "error": {
    "message": "The encrypted content gAAA...o4o= could not be verified.",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}; retrying 1/5 in 194ms…

mgscox avatar Sep 26 '25 18:09 mgscox

I too logged out from my OpenAI account and selected API, and it also happens to me,
but I found out that it happens when I resume conversations that were started with the subscription account (I assume they are encrypted?),
so this issue resolves when starting a new conversation with the Azure API access.

danielwolfman avatar Sep 29 '25 16:09 danielwolfman

I was able to reproduce this consistently. It happens when I try to resume a session that was originally created while using ChatGPT/Plus auth, after switching Codex to use an OpenAI API key (preferred_auth_method="apikey").

New sessions work fine with the API key, but resuming any pre-existing session throws

⚠️ stream error: unexpected status 400 Bad Request:
{ "error": { "message": "The encrypted content ... could not be verified." } }

Starting a fresh session resolves it.

So it looks like the resume token/session blob stored in the rollout JSON is bound to the previous auth context, and becomes invalid once the credential type changes.

bline avatar Oct 11 '25 01:10 bline

Same thing happens to me with a resumed session after switching to API auth. Pretty much kills the point of resuming... Doing a very dirty strip of the encrypted content (which isn't everything; don't know what it is, screenshots maybe?) at least lets me resume the session with reasonable history:

cat rollout-2025-10-15T10-04-17-0199e82f-9acf-7ee3-ba1c-552dfecbec8d.jsonl | grep -v "encrypted_content" >| rollout-2025-10-16T10-04-17-0199e82f-9acf-7ee3-ba1c-552dfecbec8d.jsonl
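If anyone wants something a bit less blunt than dropping whole lines, below is an untested sketch (my own, not anything Codex ships) that parses each JSONL line with serde_json and removes only the encrypted_content fields, keeping the rest of each entry; it assumes the resume path tolerates the field simply being absent:

// Untested sketch: strip "encrypted_content" fields from a rollout JSONL
// instead of dropping whole lines. Requires the serde_json crate.
use std::fs::File;
use std::io::{BufRead, BufReader, BufWriter, Write};

fn strip_encrypted(value: &mut serde_json::Value) {
    match value {
        serde_json::Value::Object(map) => {
            // Remove the field at this level, then recurse into the rest.
            map.remove("encrypted_content");
            for v in map.values_mut() {
                strip_encrypted(v);
            }
        }
        serde_json::Value::Array(items) => {
            for v in items {
                strip_encrypted(v);
            }
        }
        _ => {}
    }
}

fn main() -> std::io::Result<()> {
    // Usage: strip_rollout <input.jsonl> <output.jsonl>
    let args: Vec<String> = std::env::args().collect();
    let reader = BufReader::new(File::open(&args[1])?);
    let mut writer = BufWriter::new(File::create(&args[2])?);
    for line in reader.lines() {
        let line = line?;
        match serde_json::from_str::<serde_json::Value>(&line) {
            Ok(mut v) => {
                strip_encrypted(&mut v);
                writeln!(writer, "{v}")?;
            }
            // Pass non-JSON lines through unchanged.
            Err(_) => writeln!(writer, "{line}")?,
        }
    }
    Ok(())
}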

This has nothing to do with Azure, btw; it's just switching from Plus/Teams auth to API auth. Switching between Plus and Teams, for what it's worth, does not cause this failure; I do that every week since I pay for both.

keen99 avatar Oct 16 '25 14:10 keen99

Still having this issue too.

Manouchehri avatar Oct 28 '25 22:10 Manouchehri

Still having this issue too in version 0.65.0.

lianneli avatar Dec 05 '25 08:12 lianneli

I am seeing this in version 0.73.0, and I am not using Azure. I am using the standard OpenAI logon (no keys) for codex. It just started happening on all my sessions but had been fine all day:

{ "error": { "message": "The encrypted content gAAA..._uYp could not be verified.", "type": "invalid_request_error", "param": null, "code": "invalid_encrypted_content" } }

Chargeuk avatar Dec 16 '25 18:12 Chargeuk

Same problem in 0.73.0. Codex CLI + Plus Subscription

mishiko58de avatar Dec 16 '25 18:12 mishiko58de

Not sure if this is the right thread to report in, but the same thing is happening on the VS Code plugin, using GPT Plus for sign-in. Tried both IDE pre-release 0.5.52 and 0.4.51. Thinking appears to go through for a few rounds, then gets cut off.

cguodesign avatar Dec 16 '25 19:12 cguodesign

This is probably some wider problem on their infra; lots of people are hitting it just now.

benapetr avatar Dec 16 '25 19:12 benapetr

same issue here in Codex

adampaulwalker avatar Dec 16 '25 19:12 adampaulwalker

Just started happening to me too, using codex cli 0.73.0 + plus subscription

oscar-urbina-tech avatar Dec 16 '25 19:12 oscar-urbina-tech

Me, too.

According to the OpenAI status page, there is currently an Incident with Codex:

https://status.openai.com/incidents/01KCM7PAMQMCM8KAB6ZCWPKNK1

So maybe this is the cause?

[Screenshot of the OpenAI status page incident]

UweKeim avatar Dec 16 '25 19:12 UweKeim

I am having this issue too: gpt-5.2 xhigh reasoning; the problem is persistent both with a Plus account and through the regular API.

trentmkelly avatar Dec 16 '25 19:12 trentmkelly

Started to happen to me as well in Opencode v1.0.163:

The encrypted content for item rs_031ab63caf3e3289016941b2addaec819b9fff383ed03423e3 could not be verified.

I am logged in via OAuth using the openhax/codex plugin for OpenCode.

efrageek avatar Dec 16 '25 19:12 efrageek

Just started seeing this a few minutes ago on the VS Code plugin. It will either fail right away or get a few steps in and then fail:

{ "error": { "message": "The encrypted content gAAA...ZHcf could not be verified.", "type": "invalid_request_error", "param": null, "code": "invalid_encrypted_content" } }

scrousenator avatar Dec 16 '25 19:12 scrousenator

I also get this error using the VS Code Codex plugin. It either fails right away or gets a few steps in and then fails too:

{ "error": { "message": "The encrypted content gAAA...8gwO could not be verified.", "type": "invalid_request_error", "param": null, "code": "invalid_encrypted_content" } }

worldofcreatives avatar Dec 16 '25 19:12 worldofcreatives

Same here: { "error": { "message": "The encrypted content gAAA...X-Y= could not be verified.", "type": "invalid_request_error", "param": null, "code": "invalid_encrypted_content" } }. However, I did seem to get it after trying to resume a previous conversation after updating to 0.73.0.

BrandonKimble avatar Dec 16 '25 20:12 BrandonKimble

Thanks for reporting. We're aware of the problem, and the team is actively investigating. It's a server-side issue and not related to any particular version of the CLI or extension. I'm using #8120 to track the issue, and I'll provide updates there as they become available.

etraut-openai avatar Dec 16 '25 20:12 etraut-openai

Already working for me!

bvalerin avatar Dec 16 '25 20:12 bvalerin

Same here. The issue at https://github.com/openai/codex/issues/8120 is marked as resolved, but I am still encountering the same error during use. In addition, newly opened conversations also experience the same issue.

{"error":{"message":"litellm.BadRequestError: litellm.ContentPolicyViolationError: litellm.ContentPolicyViolationError: AzureException - {\n "error": {\n "message": "The encrypted content gAAA...Za4= could not be verified.",\n "type": "invalid_request_error",\n "param": null,\n "code": "invalid_encrypted_content"\n }\n}\nmodel=gpt-5.1-codex-max. content_policy_fallback=None. fallbacks=None.\n\nSet 'content_policy_fallback' - https://docs.litellm.ai/docs/routing#fallbacks. Received Model Group=gpt-5.1-codex-max\nAvailable Model Group Fallbacks=None","type":null,"param":null,"code":"400","provider_specific_fields":{"innererror":null}}}

jlliu-sequoia avatar Dec 17 '25 02:12 jlliu-sequoia