openai-cli
404 Client Error: Not Found for url: https://api.openai.com/v1/completions
Python 3.11.6 (main, Nov 2 2023, 04:39:43) [Clang 14.0.3 (clang-1403.0.22.14.1)] on darwin
~ » echo "Are cats faster than dogs?" | openai complete -
Traceback (most recent call last):
  File "/opt/homebrew/bin/openai", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai_cli/cli.py", line 24, in complete
    result = client.generate_response(prompt, model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/lib/python3.11/site-packages/openai_cli/client.py", line 36, in generate_response
    response.raise_for_status()
  File "/opt/homebrew/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://api.openai.com/v1/completions
Can you help, please?
I suspect this is because the default model was removed from the OpenAI API. Could you please try a different model, like this:
echo "Are cats faster than dogs?" | openai complete -m gpt-3.5-turbo -
I don't have API credits left, but I see that the error changes from 404 to 429 when I change the model.
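For anyone debugging this locally: the two status codes seen in this thread point to distinct failure modes, and the CLI could surface them more helpfully before raising. A minimal sketch (the `explain_http_error` helper and its wording are hypothetical, not part of openai-cli):

```python
# Hypothetical helper: translate an HTTP status from the OpenAI API
# into a human-readable hint instead of a bare HTTPError.
def explain_http_error(status: int) -> str:
    hints = {
        401: "invalid or missing API key",
        404: "endpoint or model not found (often a deprecated model)",
        429: "rate limited, or the account is out of quota/credits",
    }
    return hints.get(status, f"unexpected HTTP error {status}")

# The two statuses reported above:
print(explain_http_error(404))  # deprecated-model case
print(explain_http_error(429))  # out-of-credits case
```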
I get the same error as @yosefy when I use "-m gpt-3.5-turbo". Wrapping this tool around the official openai Python library might make it more stable than relying on the raw API endpoint staying fixed. I'll also point out that this tool shadows the openai executable that ships with the official openai Python library.
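A sketch of what that wrapping could look like, assuming the official openai Python package (v1+) is installed and OPENAI_API_KEY is set; build_messages and generate_response are illustrative names here, not openai-cli's actual API:

```python
import os


def build_messages(prompt: str) -> list[dict]:
    """Wrap a raw stdin prompt in the chat-completions message format."""
    return [{"role": "user", "content": prompt}]


def generate_response(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    # The official client resolves the endpoint itself, so a removed
    # model surfaces as a descriptive error rather than a bare 404
    # from a hard-coded https://api.openai.com/v1/completions URL.
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    resp = client.chat.completions.create(
        model=model,
        messages=build_messages(prompt),
    )
    return resp.choices[0].message.content
```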
PRs are welcome. I'm not using the tool myself these days.