opencommit
[Bug]: Request failed with status code 400
Opencommit Version
3.0.4
Node Version
v20.9.0
NPM Version
What OS are you seeing the problem on?
Other Linux Distro
What happened?
A bug happened!
I am getting bad request 400 all the time. I have confirmed that:
- my network is good
- my git diff is not too long (less than 1k tokens)
- switching between gpt-3.5-turbo, gpt-3.5-turbo-16k and gpt-4 does not help; none of them works for me
Expected Behavior
The commit message is generated successfully.
Current Behavior
Request failed with status code 400.
Possible Solution
No response
Steps to Reproduce
No response
Relevant log output
┌ open-commit
│
◇ 4 staged files:
arch/.chezmoiscripts/programming/run_after_javascript.sh.tmpl
arch/dot_aicommits.tmpl
arch/dot_opencommit.tmpl
arch/other/pkg/aur.txt.tmpl
│
◑ Generating the commit message.
│
└ ✖ {"model":"gpt-3.5-turbo-16k","messages":[{"role":"system","content":"You are to act as the author of a commit message in git. Your mission is to create clean and comprehensive commit messages as per the conventional commit convention and explain WHAT were the changes and mainly WHY the changes were done. I'll send you an output of 'git diff --staged' command, and you are to convert it into a commit message.\n Do not preface the commit with anything.\n Add a short description of WHY the changes are done after the commit message. Don't start it with \"This commit\", just describe the changes.\n Use the present tense. Lines must not be longer than 74 characters. Use english for the commit message."},{"role":"user","content":"diff --git a/src/server.ts b/src/server.ts\n index ad4db42..f3b18a9 100644\n --- a/src/server.ts\n +++ b/src/server.ts\n @@ -10,7 +10,7 @@\n import {\n initWinstonLogger();\n \n const app = express();\n -const port = 7799;\n +const PORT = 7799;\n \n app.use(express.json());\n \n @@ -34,6 +34,6 @@\n app.use((_, res, next) => {\n // ROUTES\n app.use(PROTECTED_ROUTER_URL, protectedRouter);\n \n -app.listen(port, () => {\n - console.log(`Server listening on port ${port}`);\n +app.listen(process.env.PORT || PORT, () => {\n + console.log(`Server listening on port ${PORT}`);\n });"},{"role":"assistant","content":"fix(server.ts): change port variable case from lowercase port to uppercase PORT to improve semantics\nfeat(server.ts): add support for process.env.PORT environment variable to be able to run app on a configurable port\nThe port variable is now named PORT, which improves consistency with the naming conventions as PORT is a constant. 
Support for an environment variable allows the application to be more flexible as it can now run on any available port specified via the process.env.PORT environment variable."},{"role":"user","content":"diff --git a/arch/.chezmoiscripts/programming/run_after_javascript.sh.tmpl b/arch/.chezmoiscripts/programming/run_after_javascript.sh.tmpl\nindex 3367563..0ab2d48 100644\n--- a/arch/.chezmoiscripts/programming/run_after_javascript.sh.tmpl\n+++ b/arch/.chezmoiscripts/programming/run_after_javascript.sh.tmpl\n@@ -6,6 +6,7 @@ set -o pipefail\n {{ include \"other/scripts/load-pnpm.sh\" }}\n \n pkg_list=(\n+ opencommit\n speedscope\n )\n pkg_list=(\"${pkg_list[@]/%/\"@latest\"}\")\ndiff --git a/arch/dot_aicommits.tmpl b/arch/dot_aicommits.tmpl\ndeleted file mode 100644\nindex e6a1012..0000000\n--- a/arch/dot_aicommits.tmpl\n+++ /dev/null\n@@ -1,5 +0,0 @@\n-OPENAI_KEY={{ (bitwarden \"item\" \"OPENAI_KEY\").notes }}\n-generate=1\n-model=gpt-3.5-turbo-16k\n-max-length=100\n-type=conventional\ndiff --git a/arch/dot_opencommit.tmpl b/arch/dot_opencommit.tmpl\nnew file mode 100644\nindex 0000000..f0d9b1f\n--- /dev/null\n+++ b/arch/dot_opencommit.tmpl\n@@ -0,0 +1,9 @@\n+OCO_OPENAI_API_KEY={{ (bitwarden \"item\" \"OPENAI_KEY\").notes }}\n+OCO_OPENAI_MAX_TOKENS=undefined\n+OCO_OPENAI_BASE_PATH=undefined\n+OCO_DESCRIPTION=true\n+OCO_EMOJI=false\n+OCO_MODEL=gpt-3.5-turbo-16k\n+OCO_LANGUAGE=en\n+OCO_MESSAGE_TEMPLATE_PLACEHOLDER=$msg\n+OCO_PROMPT_MODULE=conventional-commit\ndiff --git a/arch/other/pkg/aur.txt.tmpl b/arch/other/pkg/aur.txt.tmpl\nindex cc0bef6..35ce0e4 100644\n--- a/arch/other/pkg/aur.txt.tmpl\n+++ b/arch/other/pkg/aur.txt.tmpl\n@@ -1,6 +1,5 @@\n 3dslicer-bin\n 7-zip-full\n-aicommits\n auth-thu-bin\n chatall-bin\n cloudflare-warp-bin"}],"temperature":0,"top_p":0.1,"max_tokens":500}
│
│
◇ 📝 Commit message generated
│
└ ✖ Request failed with status code 400
Possible relevant issues: #8, #89, #120, #158, #170.
This continues to be annoying for large file changes (especially ipynb), since they mix code changes and output changes.
My current solution is to just create the .opencommitignore file with *.ipynb, and that handles most of my errors. However, it isn't great that it simply skips those files; I believe a different approach should be used.
This issue will be slightly alleviated when the new gpt-4 longer context windows are implemented here: https://github.com/di-sukharev/opencommit/pull/274
@closedLoop I believe it's not due to large file changes in my case because my git diff is less than 1k tokens as I stated above. No matter how small the changes I make, I still receive a code 400.
I tried to log more error info. Here is the response body I got:
<html>
<head><title>400 The plain HTTP request was sent to HTTPS port</title></head>
<body>
<center><h1>400 Bad Request</h1></center>
<center>The plain HTTP request was sent to HTTPS port</center>
<hr><center>cloudflare</center>
</body>
</html>
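For reference, a minimal sketch of how the hidden response body could be surfaced instead of the bare "status code 400" message. The err.response / err.response.data fields follow the axios error shape, which is an assumption about the HTTP client opencommit uses, not confirmed here.

```javascript
// Hypothetical helper: turn an axios-style error into a readable line
// that includes the server's actual response body.
function describeHttpError(err) {
  if (err.response) {
    // The server answered, just with a non-2xx status.
    return `HTTP ${err.response.status}: ${JSON.stringify(err.response.data)}`;
  }
  // No response at all (network failure, bad proxy, etc.).
  return err.message;
}

console.log(describeHttpError({ response: { status: 400, data: 'Bad Request' } }));
// → HTTP 400: "Bad Request"
```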
I suspect there is misbehavior when running behind a proxy, because Node.js doesn't have built-in support for environment-defined proxies.
I believe it has something to do with the environment-defined proxy (env: https_proxy), because after I switched to TUN mode the problem was solved. We should add support for proxies based on environment variables.
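A sketch of what such env-based proxy support might look like. Node.js does not read https_proxy on its own, so the CLI would have to parse it and hand the result to its HTTP client. The parseProxy helper and the variable lookup order below are assumptions for illustration, not opencommit's actual implementation.

```javascript
// Hypothetical sketch: read the proxy URL from the environment (the way
// curl and many other tools do) and parse it into connection parts.
function parseProxy(env) {
  const url = env.https_proxy || env.HTTPS_PROXY;
  if (!url) return null; // no proxy configured
  const { protocol, hostname, port } = new URL(url);
  return {
    protocol: protocol.replace(':', ''),
    host: hostname,
    // Fall back to the protocol's default port when none is given.
    port: Number(port) || (protocol === 'https:' ? 443 : 80),
  };
}

console.log(parseProxy({ https_proxy: 'http://127.0.0.1:7890' }));
// → { protocol: 'http', host: '127.0.0.1', port: 7890 }
```

Note the scheme matters: the "plain HTTP request was sent to HTTPS port" body above is exactly what a proxy or edge server returns when an http:// client talks to an https:// endpoint.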
Stale issue message
reopening, will take a look at this
we now support gpt-4 turbo, try using it; or don't commit extremely large files at once, commit in batches