Aider BUG (Trying to use OpenAI API all the time)
Issue
I use the Vertex AI API, and when using Aider I constantly run into this problem, which makes the software unusable:
"These steps will set up Git for your project and create an initial commit with all your files, excluding those specified in the .gitignore file.
Is there anything specific you'd like to do next with your Git setup or project?
Tokens: 44,461 sent, 315 received. Cost: $0.14 request, $2.25 session.
Committing .gitignore before applying edits.
Exception while updating files:
litellm.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 1032, in completion
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 927, in completion
    openai_client = self._get_openai_client(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 743, in _get_openai_client
    _new_client = OpenAI(
  File "/home/soporte/.local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 1032, in completion
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 927, in completion
    openai_client = self._get_openai_client(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 743, in _get_openai_client
    _new_client = OpenAI(
  File "/home/soporte/.local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/main.py", line 1371, in completion
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/main.py", line 1344, in completion
    response = openai_chat_completions.completion(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 1040, in completion
    raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 1032, in completion
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 927, in completion
    openai_client = self._get_openai_client(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 743, in _get_openai_client
    _new_client = OpenAI(
  File "/home/soporte/.local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1496, in apply_updates
    edited = self.update_files()
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1490, in update_files
    edits = self.prepare_to_edit(edits)
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1483, in prepare_to_edit
    self.dirty_commit()
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1599, in dirty_commit
    self.repo.commit(fnames=self.need_commit_before_edits)
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/repo.py", line 99, in commit
    commit_message = self.get_commit_message(diffs, context)
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/repo.py", line 177, in get_commit_message
    commit_message = simple_send_with_retries(model.name, messages)
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/sendchat.py", line 113, in simple_send_with_retries
    _hash, response = send_with_retries(
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/sendchat.py", line 69, in wrapper
    return decorated_func(*args, **kwargs)
  File "/home/soporte/.local/lib/python3.10/site-packages/backoff/_sync.py", line 105, in retry
    ret = target(*args, **kwargs)
  File "/home/soporte/.local/lib/python3.10/site-packages/aider/sendchat.py", line 103, in send_with_retries
    res = litellm.completion(**kwargs)
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/utils.py", line 1028, in wrapper
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/utils.py", line 908, in wrapper
    result = original_function(*args, **kwargs)
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/main.py", line 2760, in completion
    raise exception_type(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/utils.py", line 8139, in exception_type
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/utils.py", line 6459, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 1032, in completion
    raise e
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 927, in completion
    openai_client = self._get_openai_client(
  File "/home/soporte/.local/lib/python3.10/site-packages/litellm/llms/openai.py", line 743, in _get_openai_client
    _new_client = OpenAI(
  File "/home/soporte/.local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
Can someone help? I am trying to use Aider, but while working with it, it keeps throwing these errors.
Version and model info
Aider v0.48.1
Models: vertex_ai/claude-3-5-sonnet@20240620 with diff edit format, weak model vertex_ai/claude-3-haiku@20240307
Git repo: .git with 10 files
Repo-map: using 1024 tokens
Thanks for trying aider and filing this issue.
Have you placed a valid key in the OPENAI_API_KEY environment variable?
Why should I do that if I am not using OpenAI? I use Vertex AI Claude 3.5 Sonnet, and I have already set up the project ID and location. I can run Aider with Vertex AI, but when it tries to edit or create a file, I get those kinds of errors.
Can you share all the announce lines that aider prints when you launch it?
@xavitg
Do you have your .aider.conf.yml file created with
model: vertex_ai/claude-3-5-sonnet@20240620
After that, you have to run Aider using
aider --config .aider.conf.yml
It works for me no problem
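For reference, here is a minimal sketch of what that file can look like. The weak-model line is an assumption on my part (so the smaller model Aider uses for things like commit messages also goes through Vertex AI); drop it if you don't need it:
# .aider.conf.yml -- minimal sketch
model: vertex_ai/claude-3-5-sonnet@20240620
# assumption: also route the weak (commit-message) model through Vertex AI
weak-model: vertex_ai/claude-3-haiku@20240307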
Hello,
So I have to manually edit that file and then run that command in order to get it working properly?
Thank you
@xavitg, maybe there is a better way, but this is how I have it set up and it works perfectly. Please give it a try.
@xavitg Can you share all the announce lines that aider prints when you launch it? And the command line with args that you use to run it?
Like this:
$ aider
Aider v0.48.2-dev
Models: claude-3-5-sonnet-20240620 with diff edit format, weak model
claude-3-haiku-20240307
Git repo: .git with 303 files
Repo-map: using 1024 tokens
Use /help <question> for help, run "aider --help" to see cmd line args
─────────────────────────────────────────────────────────────────────────────
>
soporte@cloudshell:~/BI-Improved (mindful-p)$ aider
Model gpt-4o: Missing these environment variables:
- OPENAI_API_KEY
For more info, see: https://aider.chat/docs/llms/warnings.html
Model gpt-4o-mini: Missing these environment variables:
- OPENAI_API_KEY
For more info, see: https://aider.chat/docs/llms/warnings.html
Aider v0.48.1
Models: gpt-4o with diff edit format, weak model gpt-4o-mini
Git repo: .git with 11 files
Repo-map: using 1024 tokens
VSCode terminal detected, pretty output has been disabled.
Use /help <question> for help, run "aider --help" to see cmd line args
But then I use /model vertex_ai/claude-3-5-sonnet@20240620, and I see this:
/model vertex_ai/claude-3-5-sonnet@20240620
Aider v0.48.1
Models: vertex_ai/claude-3-5-sonnet@20240620 with diff edit format, weak model vertex_ai/claude-3-haiku@20240307
Git repo: .git with 11 files
Repo-map: using 1024 tokens
─────────────────────────────────────────────────────────────────────────────
aider --config .aider.conf.yml
I am not able to find this file. I installed Aider with pip install aider in my root directory, without any virtual environment. It's installed here: ~/.local/lib/python3.10/site-packages
But I am not able to find that file, even using the command: find ~ -name ".aider.conf.yml"
@xavitg, this is the file you need to create yourself in the root of the project.
Additionally, for Vertex AI to work we had to create a .env file with:
VERTEXAI_PROJECT="your-project-name"
VERTEXAI_LOCATION="your-region"
This is how you can configure Aider, and we chose to do it that way. Let's see what @paul-gauthier suggests.
Ok, so can you try this:
aider --model vertex_ai/claude-3-5-sonnet@20240620
You need to have VERTEXAI_PROJECT, VERTEXAI_LOCATION and GOOGLE_APPLICATION_CREDENTIALS set in your environment or in a .env file to work with vertex. It's one of the most cumbersome providers to use.
I think that should work for you. If not, can you paste the entire output from where you run that command all the way to the error messages?
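Roughly something like this, with placeholder values (the key file path is just an example, adjust it to your setup):
# placeholder values; adjust to your project and key path
export VERTEXAI_PROJECT="your-project-name"
export VERTEXAI_LOCATION="your-region"
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your-service-account-key.json"
aider --model vertex_ai/claude-3-5-sonnet@20240620
The same three variables can go in a .env file in the project directory instead of being exported.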
@xavitg here is a summary of my setup:
- create a .aider.conf.yml file in the root of your project with this content:
model: vertex_ai/claude-3-5-sonnet@20240620
- create a .env file in the root of your project with this content:
VERTEXAI_PROJECT="your-project-name"
VERTEXAI_LOCATION="your-region"
You have to be logged in via gcloud to the project that has Vertex AI. Do:
gcloud auth login
Then to validate you have the right project set do:
gcloud config get-value project
And if you need to change it:
gcloud config set project your-project-name
Then run Aider like so:
aider --config .aider.conf.yml
This is all that should be needed
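Put together, the whole setup is roughly this (just a sketch of the steps above, with placeholder values):
# sketch of the steps above; replace the placeholder values with your own
gcloud auth login
gcloud config set project your-project-name
cat > .aider.conf.yml <<'EOF'
model: vertex_ai/claude-3-5-sonnet@20240620
EOF
cat > .env <<'EOF'
VERTEXAI_PROJECT="your-project-name"
VERTEXAI_LOCATION="your-region"
EOF
aider --config .aider.conf.yml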
Ok, so can you try this:
aider --model vertex_ai/claude-3-5-sonnet@20240620
You need to have VERTEXAI_PROJECT, VERTEXAI_LOCATION and GOOGLE_APPLICATION_CREDENTIALS set in your environment or in a .env file to work with vertex. It's one of the most cumbersome providers to use. I think that should work for you. If not, can you paste the entire output from where you run that command all the way to the error messages?
I already did this, but I get the same problem. I am using the Google Cloud "Cloud Shell", so I just need to set up a .env file with the project ID and location. I did that, and this is what happens: https://prnt.sc/ji70HceR26WM You can see that by default it tries to use the OpenAI API.
- create a .aider.conf.yml file in the root of your project with this content:
model: vertex_ai/claude-3-5-sonnet@20240620
This is working well! Hallelujah.
It’s a bit of a hassle having to use that command instead of just ‘aider’ all the time, but at least it works.
Thank you so much for your time! @paul-gauthier, take a look at this :)
@paul-gauthier please consider updating the documentation with that, as I see this is quite a popular request.
Credit goes to @zxkxyz; I got to know about this from him.
I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.