OpenHands
fix(agenthub): limit context size sent to LLM API
What?
When the observation grows large (more than 16,385 tokens, e.g. when setting up a development environment), the LLM API (in my case ChatGPT-3.5) throws a 400 error.
I think that if we limit the context size sent to the LLM API, we don't have to worry about that 400 error. I didn't try Gemini or Vertex LLMs, and even for GPT I didn't try GPT-4, so there may be room for improvement, such as a conditional branch that handles each LLM's limit.
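Roughly, the idea is this (a minimal sketch, not the actual diff; the tiktoken-based counting and the exact constant are assumptions for illustration):

```python
import tiktoken

MAX_CONTEXT_TOKENS = 16_385  # gpt-3.5-turbo's context window

def truncate_history(messages, model='gpt-3.5-turbo'):
    """Drop the oldest non-system messages until the prompt fits the window."""
    enc = tiktoken.encoding_for_model(model)

    def count(msgs):
        # Rough count: ignores the few bookkeeping tokens per chat message.
        return sum(len(enc.encode(m['content'])) for m in msgs)

    system = [m for m in messages if m['role'] == 'system']
    rest = [m for m in messages if m['role'] != 'system']
    while rest and count(system + rest) > MAX_CONTEXT_TOKENS:
        rest.pop(0)  # discard the oldest turn first
    return system + rest
```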
Thank you for reading my PR.
Should we condense the messages, or swap a portion into long-term memory, instead of truncating them directly?
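For example, condensing could mean replacing older turns with a single summary message rather than dropping them (a hypothetical sketch; `summarize` stands in for whatever LLM-backed summarizer we'd use):

```python
def condense(messages, keep_recent=10):
    """Replace all but the most recent turns with one summary message."""
    if len(messages) <= keep_recent:
        return messages
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    summary = summarize(old)  # hypothetical LLM-backed summarizer
    header = {'role': 'system',
              'content': f'Summary of earlier conversation: {summary}'}
    return [header] + recent
```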
@xingyaoww I vaguely remember you doing something like this already
@coffeecupjapan we should at least extract out the magic constant, so we can tune this a bit
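Something like the following would do (illustrative only; reading it from an environment variable is just one option):

```python
import os

# Tunable budget instead of a hard-coded literal inside the agent.
MAX_OBSERVATION_CHARS = int(os.getenv('MAX_OBSERVATION_CHARS', '10000'))
```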
See also: https://github.com/OpenDevin/OpenDevin/pull/1421
Codecov Report
Attention: Patch coverage is 0%, with 1 line in your changes missing coverage. Please review.
:exclamation: No coverage uploaded for pull request base (main@31c1a2d).
| Files | Patch % | Lines |
|---|---|---|
| agenthub/codeact_agent/codeact_agent.py | 0.00% | 1 Missing :warning: |
Additional details and impacted files
@@ Coverage Diff @@
## main #1485 +/- ##
=======================================
Coverage ? 58.52%
=======================================
Files ? 83
Lines ? 3479
Branches ? 0
=======================================
Hits ? 2036
Misses ? 1443
Partials ? 0
@rbren Yes! We already have this in place in https://github.com/OpenDevin/OpenDevin/pull/1494
https://github.com/OpenDevin/OpenDevin/blob/24750ba04f0da3fe7679e7bac1ed25b480198c7a/agenthub/codeact_agent/codeact_agent.py#L37-L44
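For context, that helper follows a middle-truncation pattern; in rough shape it looks like this (a sketch of the general approach, not necessarily the verbatim linked code; see the permalink above for the real implementation):

```python
def truncate_observation(observation: str, max_chars: int = 10_000) -> str:
    """Keep the head and tail of an oversized observation, eliding the middle."""
    if len(observation) <= max_chars:
        return observation
    half = max_chars // 2
    return (
        observation[:half]
        + '\n[... truncated due to length ...]\n'
        + observation[-half:]
    )
```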
@coffeecupjapan I'm going to close this one, but feel free to open up an issue where we can discuss a more long-term solution!