Auto-compact when approaching context limit
What feature would you like to see?
Auto-compact when approaching context limit
Are you interested in implementing this feature?
No response
Additional information
No response
Is anybody working on this feature? I'm willing to implement it if needed.
This is indeed very desired, especially with the gpt-oss models, whose default context length on Ollama is set to 8024 tokens.
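As a workaround for the small default, Ollama's context window can be raised per model through a Modelfile (a sketch assuming a locally pulled `gpt-oss` model; the 32768 value is only an example and should fit your hardware):

```
# Modelfile: derive a variant of gpt-oss with a larger context window
FROM gpt-oss
PARAMETER num_ctx 32768
```

Then build and run the variant with `ollama create gpt-oss-32k -f Modelfile` followed by `ollama run gpt-oss-32k`.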
For this to be implemented, this issue https://github.com/openai/codex/issues/2927 should probably be fixed first. Otherwise it would become "auto-delete" AGENTS.md instead of auto-compact.
was auto-compact implemented?
not yet
So people are paying money to access a coding system, but CODEX inevitably burns out and simply cannot continue working without a dramatic change in workflow that makes the tool more frustrating than useful. I'm not able to ask it to summarize itself so that it can fit into context length because it's at context length. So, there is effectively no point in continuing the project using CODEX.
In other words, your system, which people paid for, simply stops working at a certain point. It becomes useless.
I argue this is a problem, not a desired feature or request for enhancement.
Is this comment made with the new auto-compaction taken into account?
It is not, no. I hadn't seen the open PR for auto-compact that was made this week, but it's still odd to me that this is marked as "enhancement".
We've enabled auto compact at 90% of the context window.
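A trigger like the one described amounts to a simple threshold check. The sketch below is a minimal illustration of that idea, not Codex's actual implementation; the function name and parameters are hypothetical:

```python
def should_auto_compact(used_tokens: int, context_window: int,
                        threshold: float = 0.90) -> bool:
    """Return True once token usage reaches the given fraction of the
    model's context window (90% here, matching the comment above)."""
    return used_tokens >= threshold * context_window

# With an 8024-token window, compaction would trigger at ~7222 tokens:
print(should_auto_compact(7000, 8024))  # False: below 90%
print(should_auto_compact(7250, 8024))  # True: past 90%
```

Triggering at 90% rather than at the hard limit leaves enough remaining context to run the summarization itself, which addresses the complaint above that compaction can no longer be requested once the window is already full.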