[Bug]: First message for a github resolver issue has the token input maxed out
### Is there an existing issue for the same bug?

- [x] I have checked the existing issues.

### Describe the bug and reproduction steps

Why is it possible to max out the token limit on the very first LLM step? I can't see where these tokens are coming from.

### OpenHands Installation

GitHub resolver

### OpenHands Version

main

### Operating System

None

### Logs, Errors, Screenshots, and Additional Context
Run cd /tmp && python -m openhands.resolver.resolve_issue \
23:22:19 - openhands:INFO: github.py:77 - Limiting resolving to issues [1869].
23:22:19 - openhands:INFO: resolve_issue.py:427 - Using output directory: output
Cloning into 'output/repo'...
23:22:22 - openhands:INFO: resolve_issue.py:449 - Base commit: 1ca29b24c38e0c1d454c8368d5eef436711dc6f0
23:22:22 - openhands:INFO: resolve_issue.py:460 - Writing output to output/output.jsonl
23:22:22 - openhands:INFO: resolve_issue.py:475 - Resolving issue 1869 with Agent CodeActAgent, model claude-3-7-sonnet-latest, max iterations 50.
23:22:22 - openhands:INFO: resolve_issue.py:181 - Starting fixing issue 1869.
23:22:22 - openhands:INFO: docker_runtime.py:140 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Starting runtime with image: ghcr.io/all-hands-ai/runtime:0.27.0-nikolaik
23:23:41 - openhands:INFO: docker_runtime.py:144 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Container started: openhands-runtime-1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71. VSCode URL: None
23:23:41 - openhands:INFO: docker_runtime.py:155 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Waiting for client to become ready at http://localhost:33974...
23:24:05 - openhands:INFO: docker_runtime.py:161 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Runtime is ready.
23:24:07 - openhands:INFO: docker_runtime.py:155 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Waiting for client to become ready at http://localhost:33974...
23:24:07 - openhands:INFO: docker_runtime.py:161 - [runtime 1ebc57d0-d9da-4246-a7c1-936e96b4bb82-dafd25f74a2eac71] Runtime is ready.
23:24:08 - openhands:INFO: resolve_issue.py:59 - ------------------------------
23:24:08 - openhands:INFO: resolve_issue.py:60 - BEGIN Runtime Completion Fn
23:24:08 - openhands:INFO: resolve_issue.py:61 - ------------------------------
23:24:08 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
cd /workspace
23:24:08 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:08 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
git config --global core.pager ""
23:24:09 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:09 - openhands:INFO: resolve_issue.py:230 - **MessageAction** (source=EventSource.USER)
CONTENT: Please fix the following issue for the repository in /workspace.
An environment has been set up for you to start working. You may assume all necessary tools are installed.
# Problem Statement
A lot of the notifyAdmins and notifyRaymond functions aren't properly HTML formatted
Recently, these two functions were converted from sending plain strings to sending html emails, but the html isn't always well formatted, can you tidy up the formatting of the htmls getting sent so that they are always fully fleshed out htmls?
Issue Thread Comments:
[OpenHands](https://github.com/All-Hands-AI/OpenHands) started fixing the issue! You can monitor the progress [here](https://github.com/modern-realty-inc/plug-chat/actions/runs/13709623343).
---
[OpenHands](https://github.com/All-Hands-AI/OpenHands) started fixing the issue! You can monitor the progress [here](https://github.com/modern-realty-inc/plug-chat/actions/runs/13710152493).
IMPORTANT: You should ONLY interact with the environment provided to you AND NEVER ASK FOR HUMAN HELP.
You SHOULD INCLUDE PROPER INDENTATION in your edit commands.
Some basic information about this repository:
# How to Contribute Overview
You are a brilliant software engineer.
This is a typescript vercel nextjs 14 app router deployed repository. We use AI to scale our brokerage to spend way less time on each client but still provide a better service than current human agents via texting. We use the openai assistant to use function calling to retrieve information like zillow, redfin, the MLS, etc.
Don't create any of the frontend, just edit backend.
## Style Guide & Contributing Instructions
To install and run, use `./bun run`. Use `./bun test <filename>` to run tests. Make sure to add the filename to the test command, otherwise it will run too many tests, which have too much output and it's hard to interpret. Do NOT run `./bun test` without a filename, it will run too many tests. Try to get even more specific, run `./bun test <filename> -t <testname>` to run a specific test inside the test file, this you will get the most detailed information on the specific issue.
<IMPORTANT>
Use `./bun test <filename> -t <testname>` to test code you've written.
</IMPORTANT>
Make sure to run `./bun run build` before finishing to ensure that everything compiles. You MUST run these bun commands before proceeding with finishing the task. DO NOT END THE EXECUTION UNLESS YOU HAVE TESTED THAT THE CODE PASSES TESTS, AND RUN BUILD CHECKS. Make sure you run `./bun`, with the `./` in front. We don't have access to installing bun, and so you will need to use the local binary.
Use `lib/services/__tests__/utils/test-utils.ts` to create tests that test the real dependencies and functionalities, not mocks. From test-utils, use assistant.chat() to process test messages, don't make other items available from the test-utils return and don't import other items from tests. Check if things work only through the test-utils's assistant.chat() method.
Don't add other class initiations into test-utils. Don't make modifications to internal variables in classes for the tests, just use the assistant.chat() method. Make all classes and functions public by default unless otherwise specified. Focus on integration tests that use real components. Avoid the "unknown" or "any" types, if possible.
<IMPORTANT>
Do NOT use mocks, use real dependencies.
DO NOT USE MOCKS, USE REAL DEPENDENCIES.
</IMPORTANT>
Keep code files to under 250 lines of length if possible. Use JSDoc documentation for all functions.
After making the new branch, keep pushing to it regularly with commit messages that are good summaries of the work that is done in that commit.
Write the least amount of code possible to complete the task. Write the cleanest, shortest, most elegant code possible that follows best software engineering practices.
<IMPORTANT>
Look through the entire codebase with searching keywords to make sure that you are not duplicating any functionality.
</IMPORTANT>
For all changes to actual application code (e.g. in Python or Javascript), add an appropriate test to the testing directory to make sure that the issue has been fixed.
Run the tests, and if they pass you are done!
You do NOT need to write new tests if there are only changes to documentation or configuration files.
When you think you have fixed the issue through code changes, please call the finish action to end the interaction.
23:24:09 - openhands:INFO: agent_controller.py:466 - [Agent Controller default] Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.RUNNING
[Agent Controller default] LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0
23:24:09 - openhands:INFO: resolve_issue.py:230 - AgentStateChangedObservation(content='', agent_state=<AgentState.RUNNING: 'running'>, observation='agent_state_changed')
23:24:09 - openhands:ERROR: agent_controller.py:242 - [Agent Controller default] Error while running the agent (session ID: default): litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"max_tokens: 128000 > 64000, which is the maximum allowed number of output tokens for claude-3-7-sonnet-20250219"}}. Traceback: Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 418, in completion
response = client.post(
^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 557, in post
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 538, in post
response.raise_for_status()
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/httpx/_models.py", line 829, in raise_for_status
raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/main.py", line 1883, in completion
response = anthropic_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/llms/anthropic/chat/handler.py", line 433, in completion
raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"max_tokens: 128000 > 64000, which is the maximum allowed number of output tokens for claude-3-7-sonnet-20250219"}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/controller/agent_controller.py", line 240, in _step_with_exception_handling
await self._step()
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/controller/agent_controller.py", line 710, in _step
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/controller/agent_controller.py", line 677, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/agenthub/codeact_agent/codeact_agent.py", line 130, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/llm/llm.py", line 242, in wrapper
resp: ModelResponse = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/utils.py", line 1212, in wrapper
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/utils.py", line 1090, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/main.py", line 3090, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2210, in exception_type
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 528, in exception_type
raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"max_tokens: 128000 > 64000, which is the maximum allowed number of output tokens for claude-3-7-sonnet-20250219"}}
23:24:09 - openhands:INFO: agent_controller.py:466 - [Agent Controller default] Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
23:24:09 - openhands:ERROR: loop.py:22 - BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"max_tokens: 128000 > 64000, which is the maximum allowed number of output tokens for claude-3-7-sonnet-20250219"}}
23:24:09 - openhands:INFO: agent_controller.py:466 - [Agent Controller default] Setting agent(CodeActAgent) state from AgentState.ERROR to AgentState.ERROR
23:24:09 - openhands:INFO: resolve_issue.py:230 - AgentStateChangedObservation(content='', agent_state=<AgentState.ERROR: 'error'>, observation='agent_state_changed')
23:24:10 - openhands:INFO: resolve_issue.py:96 - ------------------------------
23:24:10 - openhands:INFO: resolve_issue.py:97 - BEGIN Runtime Completion Fn
23:24:10 - openhands:INFO: resolve_issue.py:98 - ------------------------------
23:24:10 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
cd /workspace
23:24:11 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:11 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
git config --global core.pager ""
23:24:11 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:11 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
git config --global --add safe.directory /workspace
23:24:12 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:12 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
git add -A
23:24:13 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:13 - ACTION
**CmdRunAction (source=None, is_input=False)**
COMMAND:
git diff --no-color --cached 1ca29b24c38e0c1d454c8368d5eef436711dc6f0
23:24:13 - OBSERVATION
**CmdOutputObservation (source=None, exit code=0, metadata={
"exit_code": 0,
"pid": -1,
"username": "openhands",
"hostname": "abb11d0f6ba8",
"working_dir": "/workspace",
"py_interpreter_path": "/openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python",
"prefix": "",
"suffix": "\n[The command completed with exit code 0.]"
})**
--BEGIN AGENT OBSERVATION--
[The command completed with exit code 0.]
[Current working directory: /workspace]
[Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.12/bin/python]
[Command finished with exit code 0]
--END AGENT OBSERVATION--
23:24:13 - openhands:INFO: resolve_issue.py:157 - ------------------------------
23:24:13 - openhands:INFO: resolve_issue.py:158 - END Runtime Completion Fn
23:24:13 - openhands:INFO: resolve_issue.py:159 - ------------------------------
23:24:13 - openhands:INFO: resolve_issue.py:259 - Got git diff for instance 1869:
--------
--------
23:24:13 - openhands:INFO: resolve_issue.py:528 - Finished.
ERROR:root: File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/resolver/resolve_issue.py", line 707, in <module>
main()
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/resolver/resolve_issue.py", line 686, in main
asyncio.run(
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/asyncio/runners.py", line 195, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/resolver/resolve_issue.py", line 510, in resolve_issue
output = await process_issue(
^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/resolver/resolve_issue.py", line 275, in process_issue
success, comment_success, result_explanation = issue_handler.guess_success(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/resolver/interfaces/issue_definitions.py", line 389, in guess_success
response = self.llm.completion(messages=[{'role': 'user', 'content': prompt}])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/openhands/llm/llm.py", line 242, in wrapper
resp: ModelResponse = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/utils.py", line 1212, in wrapper
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/utils.py", line 1090, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/main.py", line 3090, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2210, in exception_type
raise e
File "/opt/hostedtoolcache/Python/3.12.9/x64/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 528, in exception_type
raise BadRequestError(
ERROR:root:<class 'litellm.exceptions.BadRequestError'>: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"max_tokens: 128000 > 64000, which is the maximum allowed number of output tokens for claude-3-7-sonnet-20250219"}}
This was a strange one caused by the Anthropic API and litellm adapting too fast; it's already hotfixed in main and should land in 0.28!
- https://github.com/All-Hands-AI/OpenHands/pull/7140
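For anyone hitting this on an older version: the failure is simply that the resolver requested `max_tokens: 128000` while the Anthropic API caps claude-3-7-sonnet output at 64000 tokens. A minimal sketch of the kind of guard the fix applies (hypothetical names, not the actual diff from the PR above) clamps the requested value to the model's output cap before the request is sent:

```python
# Hypothetical sketch of the guard; names and the cap table are
# illustrative, not OpenHands' actual implementation.

# Output caps per model; the 64000 figure for claude-3-7-sonnet
# comes straight from the error message in the log above.
MODEL_OUTPUT_CAPS = {
    "claude-3-7-sonnet-20250219": 64000,
}


def clamp_max_tokens(model: str, requested: int, default_cap: int = 4096) -> int:
    """Return a max_tokens value the provider will accept for this model."""
    cap = MODEL_OUTPUT_CAPS.get(model, default_cap)
    return min(requested, cap)


# The resolver was requesting 128000; the clamp keeps it at the 64000 limit.
print(clamp_max_tokens("claude-3-7-sonnet-20250219", 128000))  # 64000
```

With a guard like this the BadRequestError above cannot occur, since the request never exceeds the provider-side limit regardless of what the config asks for.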
The issue still seems to be present in the resolver even when using this runtime image, which has the merged code: ghcr.io/all-hands-ai/runtime:7140-merge-nikolaik
Was this resolved with the latest release?
I did not encounter the bug anymore with 0.28
Yep this was fixed. Closing.