Feature Request: Enable 1M Context Window for Claude Code
Thank you for the excellent news about the 1M token context window being available on the API. This is a welcome and powerful change.
We would like to formally request that this 1M context window capability be enabled for the Claude Code CLI tool.
The ability to load an entire codebase into context would be a game-changer for Claude Code, transforming it into an essential tool for large-scale analysis, architecture questions, and complex refactoring tasks.
Please make this feature available for Claude Code.
Claude Code v1.0.80
> /model sonnet[1m]
⎿ Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])
I tested this and saw:
> /model
⎿ Kept model as sonnet[1m]
> read google-gemini-gemini-cli-8a5edab282632443.txt and give an executive summary of the architecture
⎿ API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta
is not yet available for this subscription."}}
This means this new (currently undocumented) model must be used with your own API key. Overall, good find, but not exactly what we want.
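For reference, this is roughly what the API-key route looks like. A minimal sketch, with some assumptions: the beta flag name (`context-1m-2025-08-07`) is taken from Anthropic's long-context announcement, and the actual send call (via the `anthropic` Python SDK's `client.beta.messages.create`) is commented out so the sketch runs without a key:

```python
# Sketch: opting in to the 1M context window over the API (not the subscription).
# Assumptions: `anthropic` SDK installed, ANTHROPIC_API_KEY set in the environment,
# and the beta id below matching the public long-context beta.
request = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "betas": ["context-1m-2025-08-07"],  # opt in to the long-context beta
    "messages": [{"role": "user", "content": "Give an executive summary of this repo."}],
}

# import anthropic
# client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY
# response = client.beta.messages.create(**request)
```

Without Tier 4 (or other granted) access on the API account, this returns the same 400 `invalid_request_error` shown above.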
Found 2 possible duplicate issues:
- https://github.com/anthropics/claude-code/issues/5628
- https://github.com/anthropics/claude-code/issues/5832
This issue will be automatically closed as a duplicate in 3 days.
- If your issue is a duplicate, please close it and 👍 the existing issue instead
- To prevent auto-closure, add a comment or 👎 this comment
🤖 Generated with Claude Code
Claude Code v1.0.80
> /model sonnet[1m]
⎿ Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])
@beala
My Claude Code does not have this model. How do I get it to use it?
Either wait to gain access (they will probably send you an email once you have it), or use it over the API. From what I've seen on Reddit, this may be limited to 20x users for now.
Is there a model name for Sonnet 1M on a subscription plan?
I am on Claude Max 20x, but I still have no access to Sonnet with 1M context.
> /model sonnet[1m]
⎿ Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])
> hi
⎿ API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta is not yet available for this
subscription."},"request_id":"req_........"}
Cursor now supports Claude 4 Sonnet with 1M tokens of context! Max Mode only! https://x.com/leerob/status/1959009436485255450
Not available in CC yet.
lol, CC Max user here, still don't have access to 1M Sonnet
Confirmed to be on the roadmap: https://x.com/alexalbert__/status/1960459659556585667
Really excited about this — will impact our use case a lot.
1M context length model should be available for 5x.
I have a 20x subscription and still can't use the 1M context.
Still not supported for the 20x subscription...
> /model sonnet[1m]
⎿ Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])
> hello
⎿ API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta is not yet available for this subscription."},"request_id":"req_011CTSzdsakiZi58ubdKmfDY"}
> /model
⎿ Set model to Default (Opus 4.1 for up to 50% of usage limits, then use Sonnet 4)
I am disappointed to see that despite paying $200/m Max users still don't have access to the 1m context window. I'd be perfectly fine with it consuming 2x of my quota once over 200k tokens. Not having the option at all without switching to the pay-as-you-go API route is very irritating and at that point I'd be better off cancelling my Max subscription in favor of the API. Dunno if it's even possible to use Claude Code with a mix of API & subscription, so options seem kinda limited for subscription users...
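To put numbers on the "2x of my quota" idea: Anthropic's published long-context API pricing for Sonnet 4 roughly doubles the input rate once a request exceeds 200K input tokens ($3 to $6 per MTok input, $15 to $22.50 per MTok output; these rates are an assumption based on the public pricing page). A quick sketch of what a pay-as-you-go fallback would cost:

```python
def sonnet_1m_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost of one Sonnet 4 API request.

    Assumed rates (per million tokens, from the public pricing page):
    <=200K input tokens: $3.00 in / $15.00 out
    > 200K input tokens: $6.00 in / $22.50 out (long-context premium)
    """
    if input_tokens <= 200_000:
        in_rate, out_rate = 3.00, 15.00
    else:
        in_rate, out_rate = 6.00, 22.50
    return input_tokens / 1e6 * in_rate + output_tokens / 1e6 * out_rate

# A single 500K-token request with 10K tokens of output:
print(f"${sonnet_1m_cost(500_000, 10_000):.2f}")  # -> $3.23
```

So a handful of genuinely huge-context requests per day over the API adds up quickly, which is why a "2x quota" option inside the Max subscription would be attractive.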
Dunno if it's even possible to use Claude Code with a mix of API & subscription, so options seem kinda limited for subscription users...
A possible workflow could be:
- Use your Max subscription for all your day-to-day coding, since that's a fixed cost.
- When you have a specific, massive task that needs the 1M context window, use the /logout command, then /login again and choose your Console (API) account.
- Run that specific task, paying for the API usage.
- Once you're done, you can /logout and log back in with your Max subscription account.
We (myself included) don't have access to Sonnet 1M with our subscriptions, so I agree there.
Official update from Anthropic:
We are currently limiting access "in public beta on the Claude Developer Platform for customers with Tier 4 and custom rate limits" as well as some Claude Max 20x users.
https://github.com/anthropics/claude-code/issues/8381#issuecomment-3349286446
Even though I have access to Sonnet 1M models via Bedrock, Claude Code can't go beyond 200k context. I have tried /model sonnet[1m] and /model global...sonnet[1m]; Claude Code shows the "1M Context" label (see image), but context is still limited to 200k. In v2.0.x, auto-compact isn't working anymore.
I have a Max 20x subscription, but as of mid-October I still don't have access to the 1 million token context, even though it's critically important to me right now. I constantly run into problems at work because of this and am ready to switch to Gemini CLI.
As an x20 tier subscriber, I'd like to add my voice to this request while highlighting why this feature represents both a technical necessity and a strategic opportunity for Anthropic.
The current 200K token limit forces us into an artificial compartmentalization that mirrors the very silos we're trying to eliminate in modern software architecture. While Claude's reasoning capabilities surpass competitors, the context limitation creates an artificial ceiling on its practical utility. Engineers are pragmatists - we'll use the tool that handles our entire problem space, even if it's marginally less capable in pure reasoning. By withholding 1M context from Claude Code while offering it via API, we're forced into a suboptimal bifurcation of our toolchain.
Claude Code with 1M context would be the only terminal-based AI coding assistant capable of handling enterprise-scale codebases in a single context. This creates a defensible moat - once teams restructure their workflows around this capability, switching costs become prohibitive. GitHub Copilot, Cursor, and others are constrained to fragmented, file-by-file assistance. Claude Code could own the "whole codebase comprehension" category.
I have a personal Max subscription, and our company has a Team subscription with 6 premier licenses. Going from 200K to 1M is a huge jump. To be honest, if Claude does this well with 200K context, it would be nice to have incremental jumps: not just 200K -> 1M, but 200K -> 400K -> 600K, etc. I surmise that with 400K context, CC could do much better, and then 600K, and so on. It likely isn't always necessary to jump to 1M. Just some thoughts.
But, of course, having 1M context would allow for introspection and refactoring of larger codebases in fewer steps. My assumption is that with a large codebase, CC reads part of it into context, summarizes, stores to memory (?), reads more, etc., seeming to iterate over all of the code. A larger context would let CC do this on more code at a time, reducing the number of such loops required.
I may also be out in left field in my understanding of how CC works. I'm just looking forward to the next big jump in the performance/capability of CC.
WHY DO WE MAX USERS STILL NOT HAVE THIS! It's insanity!
Please release for Max users. Had this at my previous job and it was critical for complex tasks.
vote
How can we get the team's attention? Tweet? Or LI post? It's been 3 months and none of the subscription plans got the 1M context length.
Would love to have 1M context on Claude Web too for projects!
+1 of 1M context for subscription users
This should really be out by now. 200k makes it hard to do anything complex; I have to compact the chat every 5 minutes. And since Gemini 3 is coming out this month, Anthropic is really going to lag behind.
Also for general web conversations too. It would be great.
I think Claude Code lost a lot of users to Cursor just because of 1M 😬