aider
Complex queries with Anthropic Opus overload the API, causing an exception that terminates the program
Issue
Hello,
I've been encountering a recurring issue with aider when running complex queries against Anthropic Opus. These queries appear to overload the API, which throws an exception and terminates the program.
Upon checking the Anthropic API logs, I noticed the following output that suggests an overload condition:
| MODEL | LATENCY | INPUT TOKENS | OUTPUT TOKENS | TYPE | ERROR |
|---|---|---|---|---|---|
| claude-3-opus-20240229 | 38.44 | 7602 | 1135 | sse | {"client_error":false,"code":529,"detail":"Overloaded"} |
Following this, aider throws an exception as detailed below:
Traceback (most recent call last):
File "C:\Users\Anonymous\pipx\venvs\aider-chat\Lib\site-packages\litellm\utils.py", line 9665, in chunk_creator
response_obj = self.handle_anthropic_chunk(chunk)
...
ValueError: Unable to parse response. Original response: event: error
...
litellm.exceptions.ServiceUnavailableError: AnthropicException - Unable to parse response. Original response: event: error
Potential Solutions:
- Improving the backoff strategy or rate limiting for API requests might help manage the load more effectively.
- It may also be beneficial to catch these specific overload exceptions and handle them gracefully within aider, possibly by retrying the request after a delay.
- Another approach could be to optimize the queries sent to Anthropic Opus to reduce their complexity or split them into smaller, more manageable parts.
I hope this information is helpful for diagnosing and addressing the issue. I look forward to any suggestions or updates on this matter.
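For reference, the backoff idea suggested above could be sketched roughly like this. Everything here is hypothetical illustration (the `TransientOverloadError` class and `flaky_call` function are stand-ins, not aider or litellm APIs):

```python
import random
import time


class TransientOverloadError(Exception):
    """Stand-in for a 529 'Overloaded' API error (hypothetical)."""


def retry_with_backoff(fn, max_retries=5, base_delay=0.5, max_delay=30.0):
    """Call fn(), retrying transient overload failures with
    exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return fn()
        except TransientOverloadError:
            if attempt == max_retries - 1:
                raise
            # Exponential backoff: base, 2x, 4x, ... capped, plus jitter
            # so many clients don't retry in lockstep.
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(delay + random.uniform(0, delay / 2))


# Simulated API call that fails twice with 'Overloaded', then succeeds.
calls = {"n": 0}


def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientOverloadError("Overloaded")
    return "ok"


result = retry_with_backoff(flaky_call, base_delay=0.01)
```

The jitter matters: without it, every client that hit the overload retries at the same instants and re-creates the spike.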
Version and model info
Aider v0.30.1
Models: claude-3-opus-20240229
with diff edit format, weak model claude-3-haiku-20240307
Git repo: .git with 166 files
Repo-map: using 1024 tokens
Thanks for trying aider and filing this issue.
It's unlikely that aider is overloading Anthropic's servers. That error is most likely because they are having capacity problems handling their overall traffic volume.
Aider does exponential backoff retries for these sorts of errors. I've added litellm.exceptions.ServiceUnavailableError
to the list of errors it will retry automatically.
The change is available in the main branch. You can get it by installing the latest version from github:
python -m pip install --upgrade git+https://github.com/paul-gauthier/aider.git
If you have a chance to try it, let me know if it works better for you.
I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.
I'm seeing a similar error. It actually occurred in the middle of a response, interrupting the diff process, which may be interesting debugging info.
I have not seen such errors accessing the Anthropic API with my own tests, though I haven't run many.
It would be valuable to have clearer feedback if it seems this is an Anthropic error.
<<<<<<< SEARCH
const expected: Patch = {
pageIds: [],
sectionIds: [],
joplinNoteIds: ["97dc971b686a47878401fe8dabaa6a1c"],
joplinFolderIds: ["aa4937d860dd455c893561a1c39e5743"],
contentHtml: `${P}> Here is an indubitably memorable quote!<br /></p>${P}@@journal .z.p1 something</p>${P}---</p>${P}<em>Source: A Goo
Book</em></p>`,
date,
};
=======
const expected: Patch = {
joplinNoteIds: ["97dc971b686a47878401fe8dabaa6a1c"],
joplinFolder
Traceback (most recent call last):
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 9965, in chunk_creator
response_obj = self.handle_anthropic_chunk(chunk)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 9274, in handle_anthropic_chunk
raise ValueError(f"Unable to parse response. Original response: {str_line}")
ValueError: Unable to parse response. Original response: event: error
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 10536, in __next__
response: Optional[ModelResponse] = self.chunk_creator(chunk=chunk)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 10467, in chunk_creator
raise exception_type(
^^^^^^^^^^^^^^^
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 8965, in exception_type
raise e
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 8940, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: Unable to parse response. Original response: event: error
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/me/.local/bin/aider", line 8, in <module>
sys.exit(main())
^^^^^^
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/aider/main.py", line 402, in main
coder.run()
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/aider/coders/base_coder.py", line 473, in run
list(self.send_new_user_message(new_user_message))
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/aider/coders/base_coder.py", line 611, in send_new_user_message
yield from self.send(messages, functions=self.functions)
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/aider/coders/base_coder.py", line 740, in send
yield from self.show_send_output_stream(completion)
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/aider/coders/base_coder.py", line 824, in show_send_output_stream
for chunk in completion:
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 10573, in __next__
raise exception_type(
^^^^^^^^^^^^^^^
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 8965, in exception_type
raise e
File "/Users/me/.local/pipx/venvs/aider-chat/lib/python3.12/site-packages/litellm/utils.py", line 7878, in exception_type
raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: AnthropicException - Unable to parse response. Original response: event: error
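On the clearer-feedback suggestion above, a minimal sketch would be to match the overload signature in the exception text and print a friendlier hint. The strings matched here are assumptions taken from the logs in this thread, not an actual aider or litellm API:

```python
def friendly_api_hint(exc: Exception) -> str:
    """Map a raw streaming/API exception to a user-facing hint.

    Matches on message text, since the 529 'Overloaded' detail in the
    logs above surfaces as an unparseable 'event: error' chunk.
    """
    text = str(exc)
    if "Overloaded" in text or "529" in text:
        return "Anthropic's servers are overloaded (HTTP 529); retrying may help."
    if "Unable to parse response" in text:
        return "The Anthropic API returned an error mid-stream; likely a server-side issue."
    return f"Unexpected API error: {text}"


hint = friendly_api_hint(
    Exception("AnthropicException - Unable to parse response. Original response: event: error")
)
```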
I am getting an overloaded error. Is it on their end, or is there something I can change to fix the issue? I'm using Claude Opus 3.6.
An error occurred: Error code: 529 - {'type': 'error', 'error': {'type': 'overloaded_error', 'message': 'Overloaded'}}
same issue :(
These errors are because Anthropic's API servers are overloaded/down. Aider is simply reporting that the server is broken.
Anthropic has been having serious issues since yesterday. You can directly check their server status here:
https://status.anthropic.com/
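Statuspage sites like the one above usually expose a JSON summary endpoint (commonly `/api/v2/status.json`; whether Anthropic's instance does is an assumption). A check could parse that payload as below, shown against a hard-coded sample rather than a live request:

```python
# Sample payload in the shape a Statuspage /api/v2/status.json endpoint
# typically returns. The values here are made up for illustration.
sample = {
    "status": {"indicator": "major", "description": "Partial System Outage"}
}


def is_degraded(payload: dict) -> bool:
    """Return True if the status indicator is anything other than 'none'.

    Statuspage indicators are typically 'none', 'minor', 'major',
    or 'critical'.
    """
    return payload.get("status", {}).get("indicator", "none") != "none"


degraded = is_degraded(sample)
```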
This has been happening for the last few days, even locally in my own dev branch against the API, and I found aider got too confused by it. Claude is great when it works, but for now I've had to fall back to OpenAI for coding and use Claude in the app, where the 529 still pops up as well. It seems tied to the US markets opening, when all the bots start analyzing and/or trading hahah