goose
Old models with small context windows get into a compaction doom loop
try claude-3-opus-20240229
yeah, so the problem is that we immediately run out of context and then just keep compacting. this model has a context limit of 4K, and our minimum message size is 8K
supporting a 4K model is unrealistic, but 8K we should be able to handle, I would think
so really what we need to do here is block models whose context window is too small, instead of letting them enter the compaction loop
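A minimal sketch of what that check could look like, assuming a fixed minimum of 8K tokens. The names (`MIN_CONTEXT_TOKENS`, `check_model_context`) are illustrative, not goose's actual API; the real fix would hook into wherever goose resolves a model's context limit at startup.

```rust
// Hypothetical guard: refuse to start a session with a model whose
// context window can't even hold the minimum message size (8K per the
// discussion above). All identifiers here are illustrative.

const MIN_CONTEXT_TOKENS: usize = 8_192;

/// Returns Ok(()) if the model's context window is large enough,
/// otherwise an error message suitable for showing to the user.
fn check_model_context(model: &str, context_limit: usize) -> Result<(), String> {
    if context_limit < MIN_CONTEXT_TOKENS {
        Err(format!(
            "model '{}' has a {}-token context window, below the {}-token minimum",
            model, context_limit, MIN_CONTEXT_TOKENS
        ))
    } else {
        Ok(())
    }
}

fn main() {
    // A 4K model (the doom-loop case above) is rejected up front...
    assert!(check_model_context("claude-3-opus-20240229", 4_096).is_err());
    // ...while an 8K model clears the bar.
    assert!(check_model_context("some-8k-model", 8_192).is_ok());
    println!("checks passed");
}
```

Rejecting the model before the session starts avoids the doom loop entirely: compaction can never free enough space when the floor for a single message already exceeds the window.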