Feature Request: Enable 1M Context Window for Claude Code

Open coygeek opened this issue 4 months ago • 42 comments

Thank you for the excellent news about the 1M token context window being available on the API. This is a welcome and powerful change.

We would like to formally request that this 1M context window capability be enabled for the Claude Code CLI tool.

The ability to load an entire codebase into context would be a game-changer for Claude Code, transforming it into an essential tool for large-scale analysis, architecture questions, and complex refactoring tasks.

Please make this feature available in Claude Code.
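
For context, enabling the 1M window over the raw API currently requires an extra beta header. A minimal sketch is below; the beta flag name context-1m-2025-08-07 is taken from the public docs at the time of writing and may change, so treat it as an assumption to verify:

    # Direct Messages API call with the 1M-context beta enabled (sketch only).
    curl https://api.anthropic.com/v1/messages \
      -H "x-api-key: $ANTHROPIC_API_KEY" \
      -H "anthropic-version: 2023-06-01" \
      -H "anthropic-beta: context-1m-2025-08-07" \
      -H "content-type: application/json" \
      -d '{
        "model": "claude-sonnet-4-20250514",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Summarize the architecture of this repo."}]
      }'

Exposing that same capability through Claude Code is what this request is about.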

coygeek avatar Aug 12 '25 23:08 coygeek

Claude Code v1.0.80

> /model sonnet[1m]
  ⎿  Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])

beala avatar Aug 14 '25 00:08 beala

I tested this and saw:

> /model 
  ⎿  Kept model as sonnet[1m]

> read google-gemini-gemini-cli-8a5edab282632443.txt and give an executive summary of the architecture
  ⎿ API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta 
    is not yet available for this subscription."}}

This means this new (currently undocumented) model must be used with your own API key. Overall, good find, but not exactly what we want.

coygeek avatar Aug 14 '25 00:08 coygeek

Found 2 possible duplicate issues:

  1. https://github.com/anthropics/claude-code/issues/5628
  2. https://github.com/anthropics/claude-code/issues/5832

This issue will be automatically closed as a duplicate in 3 days.

  • If your issue is a duplicate, please close it and 👍 the existing issue instead
  • To prevent auto-closure, add a comment or 👎 this comment

🤖 Generated with Claude Code

github-actions[bot] avatar Aug 15 '25 21:08 github-actions[bot]

Claude Code v1.0.80

> /model sonnet[1m]
  ⎿  Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])

@beala

My Claude Code does not have this model. How did you get it to work?

junmediatek avatar Aug 18 '25 09:08 junmediatek

Claude Code v1.0.80

> /model sonnet[1m]
  ⎿  Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])

@beala

My Claude Code does not have this model. How did you get it to work?

Either wait to gain access (they will probably send you an email once you have it), or use it over the API. From what I've seen on Reddit, this may be limited to 20x users for now.

OpenSource03 avatar Aug 20 '25 11:08 OpenSource03

Is there a model name for Sonnet 1M that works on a subscription plan?

nullswan avatar Aug 20 '25 16:08 nullswan

I am on Claude Max 20x, but I still have no access to Sonnet with 1M context.

> /model sonnet[1m]
  ⎿  Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])

> hi
  ⎿  API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta is not yet available for this 
     subscription."},"request_id":"req_........"}

0x10sh avatar Aug 21 '25 07:08 0x10sh

Cursor now supports Claude 4 Sonnet with 1M tokens of context! Max Mode only! https://x.com/leerob/status/1959009436485255450

Not available in CC yet.

coygeek avatar Aug 23 '25 20:08 coygeek

lol, CC Max user here, still don't have access to 1M Sonnet

moinulmoin avatar Aug 25 '25 11:08 moinulmoin

Confirmed to be on the roadmap: https://x.com/alexalbert__/status/1960459659556585667

coygeek avatar Aug 27 '25 18:08 coygeek

Really excited about this — will impact our use case a lot.

lukalotl avatar Aug 27 '25 21:08 lukalotl

The 1M context model should be available for 5x plans too.

Selimonder avatar Sep 11 '25 19:09 Selimonder

I have a 20x subscription and am still not able to use the 1M context.

silver-epsilo avatar Sep 12 '25 03:09 silver-epsilo

Still not supported for the 20x subscription...

> /model sonnet[1m]
  ⎿  Set model to sonnet[1m] (claude-sonnet-4-20250514[1m])

> hello
  ⎿  API Error: 400 {"type":"error","error":{"type":"invalid_request_error","message":"The long context beta is not yet available for this subscription."},"request_id":"req_011CTSzdsakiZi58ubdKmfDY"}

> /model
  ⎿  Set model to Default (Opus 4.1 for up to 50% of usage limits, then use Sonnet 4)

silver-epsilo avatar Sep 24 '25 09:09 silver-epsilo

I'm disappointed to see that, despite paying $200/month, Max users still don't have access to the 1M context window. I'd be perfectly fine with it consuming 2x of my quota once over 200k tokens. Not having the option at all without switching to the pay-as-you-go API route is very irritating, and at that point I'd be better off cancelling my Max subscription in favor of the API. Dunno if it's even possible to use Claude Code with a mix of API & subscription, so options seem kinda limited for subscription users...

Sceleratis avatar Sep 29 '25 22:09 Sceleratis

Dunno if it's even possible to use Claude Code with a mix of API & subscription, so options seem kinda limited for subscription users...

A possible workflow (sketched below) could be:

- Use your Max subscription for all your day-to-day coding, since that's a fixed cost.
- When you have a specific, massive task that needs the 1M context window, use the /logout command, then /login again and choose your Console (API) account.
- Run that specific task, paying for the API usage.
- Once you're done, /logout and log back in with your Max subscription account.
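
In shell terms, a rough equivalent that skips the interactive login switch. This assumes Claude Code picks up ANTHROPIC_API_KEY (it may prompt you to confirm) and that the --model flag accepts the same sonnet[1m] alias as /model; both are assumptions worth verifying on your version:

    # Hypothetical one-off session billed to the API instead of the subscription.
    export ANTHROPIC_API_KEY=sk-ant-your-console-key   # Console (API) key, not the Max login
    claude --model 'sonnet[1m]'                        # run the large-context task here
    unset ANTHROPIC_API_KEY                            # later sessions use the Max login again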


We (myself included) don't have access to Sonnet 1M with our subscriptions, so I agree there.

coygeek avatar Sep 30 '25 00:09 coygeek

Official update from Anthropic:

We are currently limiting access "in public beta on the Claude Developer Platform for customers with Tier 4 and custom rate limits" as well as some Claude Max 20x users.

https://github.com/anthropics/claude-code/issues/8381#issuecomment-3349286446

coygeek avatar Sep 30 '25 02:09 coygeek

Even though I have access to the Sonnet 1M models via Bedrock, Claude Code can't go beyond 200k context. I have tried /model sonnet[1m] and /model global...sonnet[1m]; Claude Code shows the 1M Context label (see image below), but the context is still limited to 200k. In v2.0.x, auto-compact isn't working anymore either.

[Screenshot: Claude Code status line showing the 1M Context label]
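
For anyone trying to reproduce this, the Bedrock setup is roughly the following; the region and inference profile ID are illustrative, so substitute whatever your account and region actually expose:

    # Sketch of pointing Claude Code at Bedrock instead of the Anthropic API.
    export CLAUDE_CODE_USE_BEDROCK=1
    export AWS_REGION=us-east-1                                           # illustrative region
    export ANTHROPIC_MODEL='us.anthropic.claude-sonnet-4-20250514-v1:0'   # illustrative model ID
    claude

Even with this in place, the effective context here stays at 200k despite the 1M Context label.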

abdul avatar Oct 08 '25 15:10 abdul

I have a Max 20x subscription, but as of mid-October I still don't have access to the 1M context, even though it's critically important to me right now. I constantly run into problems at work because of this and am ready to switch to the Gemini CLI.

LeoWebMarketing avatar Oct 13 '25 18:10 LeoWebMarketing

As an x20 tier subscriber, I'd like to add my voice to this request while highlighting why this feature represents both a technical necessity and a strategic opportunity for Anthropic.

The current 200K token limit forces us into an artificial compartmentalization that mirrors the very silos we're trying to eliminate in modern software architecture. While Claude's reasoning capabilities surpass its competitors', the context limitation puts an artificial ceiling on its practical utility. Engineers are pragmatists - we'll use the tool that handles our entire problem space, even if it's marginally less capable in pure reasoning. With 1M context withheld from Claude Code but offered via the API, we're forced into a suboptimal bifurcation of our toolchain.

Claude Code with 1M context would be the only terminal-based AI coding assistant capable of handling enterprise-scale codebases in a single context. This creates a defensible moat - once teams restructure their workflows around this capability, switching costs become prohibitive. GitHub Copilot, Cursor, and others are constrained to fragmented, file-by-file assistance. Claude Code could own the "whole codebase comprehension" category.

markcrandall avatar Oct 14 '25 01:10 markcrandall

I have a personal Max subscription, and our company has a Team subscription with 6 premier licenses. Going from 200K to 1M is a huge jump. To be honest, if Claude does this well with 200K context, it would be nice to have incremental jumps: not just 200K -> 1M, but 200K -> 400K -> 600K, etc. I surmise that with 400K context CC could do much better, and then 600K, and so on. It likely isn't always necessary to jump all the way to 1M. Just some thoughts.

But of course, having 1M context would allow for introspection, refactoring, etc. of larger code bases in fewer steps. My assumption is that with a large code base, CC reads part of it into context, summarizes, stores to memory (??), reads more, and so on, iterating over all of the code. A larger context would allow CC to do this on more code at a time, reducing the number of such loops required.

I may also be out in left field in my understanding of how CC works. I'm just looking forward to the next big jump in performance/capability of CC.

kutenai avatar Oct 19 '25 02:10 kutenai

WHY DO WE MAX USERS STILL NOT HAVE THIS! It's insanity!

advenimus avatar Oct 22 '25 18:10 advenimus

Please release for Max users. Had this at my previous job and it was critical for complex tasks.

juanpprieto avatar Nov 02 '25 01:11 juanpprieto

vote

TinDang97 avatar Nov 03 '25 05:11 TinDang97

How can we get the team's attention? A tweet? A LinkedIn post? It's been 3 months and none of the subscription plans have gotten the 1M context length.

itsmarathon avatar Nov 05 '25 15:11 itsmarathon

Would love to have 1M context on Claude Web too for projects!

christianvuye avatar Nov 07 '25 05:11 christianvuye

+1 for 1M context for subscription users

SebastianAtom avatar Nov 08 '25 15:11 SebastianAtom

This should really be out by now. 200k makes it hard to do anything complex; I have to compact the chat every 5 minutes. And since Gemini 3 is coming out this month, Anthropic is really going to lag behind.

Grinsven avatar Nov 14 '25 16:11 Grinsven

It would be great for general web conversations too.

OpenSource03 avatar Nov 17 '25 10:11 OpenSource03

I think Claude Code lost a lot of users to Cursor just because of 1M context 😬

Selimonder avatar Nov 19 '25 23:11 Selimonder