
HTTP 400 Bad Request (invalid_request_error context_length_exceeded max_tokens)

Open PierrunoYT opened this issue 1 year ago • 25 comments

Verify it's not a duplicate bug report

Describe the Bug

I get this very often even if I don't use long texts.

HTTP 400 Bad Request. You may have exhausted your OpenAI subscription allowance or have an expired account. You can clear your API Key from VS Code's secrets storage with Genie: Clear API Key command. Check your allowance and account's expiration date here: https://platform.openai.com/account/usage

{ "error": { "message": "This model's maximum context length is 4097 tokens. However, you requested 4925 tokens (2901 in the messages, 2024 in the completion). Please reduce the length of the messages or completion.", "type": "invalid_request_error", "param": "messages", "code": "context_length_exceeded" } }
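The numbers in the error add up directly: the request is rejected because the prompt tokens plus the requested completion budget exceed the model's context window. A minimal sketch of that check (assumed behavior, not OpenAI's actual implementation):

```python
# Sketch of the budget check behind context_length_exceeded
# (assumed behavior, not OpenAI's actual code).
CONTEXT_LIMIT = 4097  # total window for the model in the error above

def fits_context(prompt_tokens: int, max_tokens: int) -> bool:
    """True if the prompt plus the completion budget fit the window."""
    return prompt_tokens + max_tokens <= CONTEXT_LIMIT

# The failing request from the error message: 2901 + 2024 = 4925 > 4097
print(fits_context(2901, 2024))  # False
```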

Please tell us if you have customized any of the extension settings or whether you are using the defaults.

I don't think so

Additional context

No response

PierrunoYT avatar Mar 31 '23 11:03 PierrunoYT

Hi - this happens because your conversation is likely long. For GPT-* models, the extension sent the whole conversation history with each request for better accuracy.

However, we see the limitations that brings, so in v0.0.5 we changed it to send only the system message + the last exchange + your new question. That way you keep recent context without spending tokens on the entire conversation.

Please upgrade to v0.0.5 to get the fix.
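The v0.0.5 strategy described above can be sketched roughly like this (hypothetical helper; the extension's actual code may differ):

```python
def build_messages(system_msg, history, new_question):
    """Keep only the system message, the most recent user/assistant
    exchange, and the new question (hypothetical sketch of v0.0.5)."""
    messages = [{"role": "system", "content": system_msg}]
    messages.extend(history[-2:])  # last exchange only, not the full history
    messages.append({"role": "user", "content": new_question})
    return messages

history = [
    {"role": "user", "content": "q1"}, {"role": "assistant", "content": "a1"},
    {"role": "user", "content": "q2"}, {"role": "assistant", "content": "a2"},
]
print(len(build_messages("You are Genie.", history, "q3")))  # 4
```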

genieai-info avatar Mar 31 '23 11:03 genieai-info

If I start a new conversation it works fine.

PierrunoYT avatar Mar 31 '23 11:03 PierrunoYT

I'm getting the same issue with 0.0.5 regardless of the API key or how I start the conversation. I've tried clearing the credential store completely.

(screenshot)

Science4583 avatar Mar 31 '23 13:03 Science4583

@Science4583 it looks like you have set your max token setting to 4096, could you confirm? Try resetting it to the default, or to a lower number if you are asking a lengthy question.

This is the default setting:

"genieai.openai.maxTokens": 2048

genieai-info avatar Mar 31 '23 14:03 genieai-info

At this time I'm only asking it "Hello world" to generate any kind of response. I just reran it with the setting reset to the default.

(screenshot)

Science4583 avatar Mar 31 '23 14:03 Science4583

Deleted the extension from ~/.vscode/extensions and globalstorage. Reinstalled and added my known-working API key.

(screenshot)

Science4583 avatar Mar 31 '23 14:03 Science4583

Don't know if it helps narrow down the root cause, but I haven't experienced the issue since disabling "Enable Conversation History" yesterday.

Dinoraptor101 avatar Mar 31 '23 14:03 Dinoraptor101

@Science4583 Could you please update the extension and check v0.0.6? We rolled out a new version just now. We noticed a bug that could cause the maxTokens setting to persist until VS Code was restarted; the update lets you change maxTokens without restarting, and that bug should now be resolved. Curious whether you can prompt "Hello world" now with maxTokens set to the default.

genieai-info avatar Mar 31 '23 14:03 genieai-info

Looks like it's working! Not sure why the same prompt fails with a higher maximum limit, though. I tried changing the value: it would fail with anything higher than 4000, and 3900 would work most of the time.

(screenshot)

Science4583 avatar Mar 31 '23 15:03 Science4583

Great @Science4583, thanks for confirming. There is a prompt-engineered system message that guides Genie to respond the best way possible, which is why a slightly higher number of tokens is used. We plan to make it configurable in a future release (you can follow updates via this issue: https://github.com/ai-genie/chatgpt-vscode/issues/39).

@PierrunoYT is it good to close, did the latest version also fix the issue for you?
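The hidden system message also explains the observation that 3900 works while 4000+ fails: the system prompt and chat formatting consume part of the 4097-token window before the completion budget is counted. Rough arithmetic with assumed token counts (the actual system-prompt size is not published here):

```python
CONTEXT_LIMIT = 4097
SYSTEM_PROMPT_TOKENS = 150  # hypothetical size of the built-in system message
QUESTION_TOKENS = 10        # e.g. "Hello world" plus message formatting

used = SYSTEM_PROMPT_TOKENS + QUESTION_TOKENS
print(used + 4000 > CONTEXT_LIMIT)  # True: maxTokens=4000 is rejected
print(used + 3900 > CONTEXT_LIMIT)  # False: maxTokens=3900 fits
```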

genieai-info avatar Mar 31 '23 16:03 genieai-info

(screenshot)

How can I deal with this?

maymayuo avatar Apr 01 '23 06:04 maymayuo

bump, I have the same problem

zimiovid avatar Apr 01 '23 10:04 zimiovid

> bump, I have the same problem

Hello, have you solved it yet?

maymayuo avatar Apr 01 '23 11:04 maymayuo

@zimiovid if you asked a very long question, OR have a very long conversation, OR changed the maxTokens setting, you could hit this issue. A bit more detail on your scenario may help.

@maymayuo did you update any settings? Are you saying you are seeing the max_tokens error response from OpenAI, or is your problem different? (It looks different from your screenshot.)

genieai-info avatar Apr 01 '23 20:04 genieai-info

Thanks for your reply. I remember I modified "Openai: Api Base Url" in the settings when adding my API key. And I did not see the max_tokens error response from OpenAI. Maybe I need to register with OpenAI before I can see that error? I am not sure.

maymayuo avatar Apr 02 '23 03:04 maymayuo

> Hello, have you solved it yet?

No

zimiovid avatar Apr 02 '23 16:04 zimiovid

> @zimiovid if you asked a very long question OR if you have a very long conversation OR if you changed the maxTokens setting you could hit this issue. A bit more details on your scenario may help.

I don't think so

(screenshot)

zimiovid avatar Apr 02 '23 16:04 zimiovid

@zimiovid did you set your maxTokens setting to 4000? It looks like so; please lower it to conform with the model's limitations (4096 tokens max, shared between request + response).
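As a rule of thumb, a safe completion budget is whatever is left of the window after the prompt. A small helper illustrating the arithmetic (the limit belongs to the model, not the extension):

```python
CONTEXT_LIMIT = 4097

def safe_max_tokens(prompt_tokens: int, limit: int = CONTEXT_LIMIT) -> int:
    """Largest completion budget that still fits alongside the prompt."""
    return max(0, limit - prompt_tokens)

# For the 2901-token prompt in the original error report:
print(safe_max_tokens(2901))  # 1196, which would have made the request fit
```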

genieai-info avatar Apr 02 '23 23:04 genieai-info

No, I never set max tokens; it has always been 2048.

(screenshot)

maymayuo avatar Apr 04 '23 07:04 maymayuo

When I set the max tokens value to less than 3900, it works normally; the context_length_exceeded error occurs when the value is greater than 3999.

XinxingLi-Chowbus avatar May 11 '23 07:05 XinxingLi-Chowbus

What should I write where it says maxtokens?

PierrunoYT avatar Jun 14 '23 14:06 PierrunoYT

> What should I write where it says maxtokens?

Go to Settings and search for "chatgpt".

ShenShu2016 avatar Jun 15 '23 20:06 ShenShu2016

You can change max tokens to 2048.

AlfanDindaR avatar Aug 18 '23 07:08 AlfanDindaR

> 3900

Works the same for me, thanks.

jonasrafael avatar Oct 10 '23 06:10 jonasrafael

@genieai-info: max_tokens sets the length of the output, right? Is there any setting that defines how much of the previous conversation is included? In longer conversations GPT often gets derailed, and I think it could be fixed by increasing the number of context tokens (= previous conversation). But can I do that somehow?

erkkimon avatar Sep 05 '24 09:09 erkkimon