fix: prevent chat conversations from exceeding token context limit
Description
Fixed chat conversations exceeding the LLM's token context limit (200,000 tokens) when too many messages accumulate.
- Implemented conversation summarization when approaching the token limit:
  - Generates a concise technical summary of the previous conversation
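The summarization flow can be sketched roughly as below. This is an illustrative sketch, not the actual Onlook implementation: the threshold ratio, the 4-characters-per-token estimate, and the `maybeSummarize`/`estimateTokens` names are all assumptions.

```typescript
// Hypothetical sketch of the summarization trigger (not Onlook's real code).
const MAX_CONTEXT_TOKENS = 200_000;
const SUMMARIZE_THRESHOLD = 0.9; // assumed: summarize at 90% of the limit

interface Message {
  role: string;
  content: string;
}

// Rough token estimate using the common ~4-characters-per-token heuristic.
function estimateTokens(messages: Message[]): number {
  return messages.reduce((sum, m) => sum + Math.ceil(m.content.length / 4), 0);
}

// When the estimate approaches the limit, collapse older messages into a
// single summary message and keep only the most recent messages verbatim.
function maybeSummarize(
  messages: Message[],
  summarize: (older: Message[]) => string,
  keepRecent = 4,
): Message[] {
  if (estimateTokens(messages) < MAX_CONTEXT_TOKENS * SUMMARIZE_THRESHOLD) {
    return messages; // still within budget; leave the history untouched
  }
  const older = messages.slice(0, -keepRecent);
  const recent = messages.slice(-keepRecent);
  return [{ role: 'assistant', content: summarize(older) }, ...recent];
}
```

In practice `summarize` would call the LLM itself to produce the concise technical summary; it is left as a parameter here so the sketch stays self-contained.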
Related Issues
Type of Change
- [x] Bug fix
- [ ] New feature
- [ ] Documentation update
- [ ] Release
- [ ] Other (please describe):