[BUG]: stuck on Generating the commit message
Opencommit Version
3.0.3
Node Version
v18.7.0
NPM Version
9.8.1
What OS are you seeing the problem on?
Other Linux Distro
What happened?
Stuck on "Generating the commit message".
Expected Behavior
normal output
Current Behavior
stuck
Possible Solution
No response
Steps to Reproduce
No response
Relevant log output
No response
Same here.
Hi, just to confirm I'm having the same problem as @zhiyue and @FrancoLab
will get to it soon, thanks for creating the ticket!
Same problem on Windows 10.
has this been resolved?
Same issue here. Please let me know if this has been resolved.
I have the same issue; reinstalling didn't help. The strange thing is that it worked fine and then started to hang without any changes to the system or config. Node 20.9.0, Windows 10.
Not yet resolved guys, if anyone has time to look into it — much appreciated <3
@di-sukharev I made some tests and this is what I came up with:
- The problem happens when execution goes inside `if (res >= MAX_REQUEST_TOKENS) {...}` in `\node_modules\opencommit\out\cli.js`, line 22105.
- It goes into that `if` when `OCO_OPENAI_MAX_TOKENS` is too large. I think people (like me) assume it specifies the maximum number of tokens allowed in the request TO OpenAI, but it is actually the maximum tokens for the response FROM OpenAI. When I set `OCO_OPENAI_MAX_TOKENS` to e.g. 500, everything works fine again.
- I also found one hardcoded value that could be problematic. In the same file `cli.js`, `DEFAULT_MODEL_TOKEN_LIMIT` is set to 4096. I currently use gpt-4, which allows up to 8192 tokens, so: a) the value is not correct, and b) it may change over time anyway, and I think it also depends on the OpenAI subscription, so it would be good to have it configurable.
- There is also one line I don't understand: `const MAX_REQUEST_TOKENS = DEFAULT_MODEL_TOKEN_LIMIT - ADJUSTMENT_FACTOR - INIT_MESSAGES_PROMPT_LENGTH - config6?.OCO_OPENAI_MAX_TOKENS`. Why subtract `OCO_OPENAI_MAX_TOKENS`? This was the problem (at least in my case): `DEFAULT_MODEL_TOKEN_LIMIT` is hardcoded to 4096 and I had set `OCO_OPENAI_MAX_TOKENS` to 8192 (the currently allowed limit for gpt-4), so the result is negative and execution went inside the `if` from point 1. (There is still some issue in that `if` expression.)
Unfortunately I don't have time to debug this deeper and implement a fix, but I think it would be good to simply have a check that `OCO_OPENAI_MAX_TOKENS` can't be larger than `DEFAULT_MODEL_TOKEN_LIMIT`.
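To make the failure mode above concrete, here is a minimal sketch of the token-budget arithmetic and the proposed validation. The constant values (`ADJUSTMENT_FACTOR`, `INIT_MESSAGES_PROMPT_LENGTH`) are placeholders, not the real numbers from `cli.js`; only the shape of the formula comes from the quoted line.

```javascript
// Hypothetical reconstruction of the budget logic described above.
// Constants below are assumed placeholder values, not the real ones.
const DEFAULT_MODEL_TOKEN_LIMIT = 4096;  // hardcoded in cli.js per the comment
const ADJUSTMENT_FACTOR = 20;            // assumed placeholder
const INIT_MESSAGES_PROMPT_LENGTH = 300; // assumed placeholder

// The subtraction reserves room for the model's reply: the prompt plus
// the (up to OCO_OPENAI_MAX_TOKENS) response must both fit inside the
// model's total context window.
function maxRequestTokens(ocoMaxTokens) {
  return (
    DEFAULT_MODEL_TOKEN_LIMIT -
    ADJUSTMENT_FACTOR -
    INIT_MESSAGES_PROMPT_LENGTH -
    ocoMaxTokens
  );
}

// Proposed sanity check: if OCO_OPENAI_MAX_TOKENS is too large, the
// budget goes negative, so every diff trips the "too many tokens"
// branch and the CLI appears to hang instead of reporting an error.
function validateConfig(ocoMaxTokens) {
  if (maxRequestTokens(ocoMaxTokens) <= 0) {
    throw new Error(
      `OCO_OPENAI_MAX_TOKENS=${ocoMaxTokens} leaves no room for the ` +
      `prompt within the ${DEFAULT_MODEL_TOKEN_LIMIT}-token model limit`
    );
  }
}
```

With these placeholder numbers, `OCO_OPENAI_MAX_TOKENS=500` leaves a positive budget, while `8192` drives it negative, matching the behavior reported above.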
Stale issue message
reopening, will take a look at this
I'm using Azure with the gpt-4o model and getting stuck on generating the commit message. However, when I switch to gpt-35-turbo it works.
Same problem here using Claude 3.5 Sonnet! What's even stranger is that when I log in to the Anthropic console, it doesn't show OpenCommit having made a call at all! Maybe the issue is that I deleted a bunch of useless files from my repo and OCO is trying to feed them to Claude?
Edit: Yeah, I think feeding the CSVs in there is the problem:
Results_Version3/Bangladesh_clean_extracted.csv
Results_Version3/Bangladesh_processed.csv
Results_Version3/Mexico_SectionA_results_with_synthesis.csv
Results_Version3/Mexico_clean_extracted.csv
Results_Version3/Mexico_processed.csv
Results_Version3/Poland_SectionA_results_20240813_173116_with_synthesis.csv
Results_Version3/Poland_clean_extracted.csv
Results_Version3/Poland_processed.csv
Results_Version3/Switzerland_SectionA_results_20240814_133840_with_synthesis.csv
Results_Version3/Switzerland_clean_extracted.csv
Results_Version3/Switzerland_processed.csv
ui/__pycache__/streamlit_app.cpython-311.pyc
ui/streamlit_app.py
│
○ Generating the commit message
I've had this issue in the past, then it somehow resolved itself, and now all of a sudden I'm having it again. My main problem is being left in the dark: there's no error message and no verbose flag I could enable to get some insight into what's going on. It's just stuck.