
[BUG] Azure ChatOpenAI gpt-4-32k has 4096 context token limit

Open matthewsilver opened this issue 1 year ago • 4 comments

Describe the bug: When using Azure ChatOpenAI with gpt-4-32k, requests fail with errors like "This model's maximum context length is 4096 tokens. However, your messages resulted in 4106 tokens (4054 in the messages, 52 in the functions). Please reduce the length of the messages or functions." This happens even after I manually increase the Max Tokens parameter on the chat model node.

To Reproduce: See the screenshots below. Continue a conversation until the stored memory exceeds 4096 tokens.
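To reproduce this deterministically without clicking through the UI, you can estimate when a conversation crosses 4096 tokens. A minimal sketch, assuming the common rough heuristic of ~4 characters per token and ~4 tokens of per-message formatting overhead (approximations, not the exact tokenizer; use tiktoken for precise counts):

```python
# Rough token estimate for a chat history.
# Assumption: ~4 characters per token, a common rule of thumb for
# English text with OpenAI tokenizers; not an exact count.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def conversation_tokens(messages: list[dict]) -> int:
    # ~4 tokens of per-message chat-formatting overhead (assumption).
    return sum(estimate_tokens(m["content"]) + 4 for m in messages)

if __name__ == "__main__":
    messages = [{"role": "user", "content": "hello " * 100}]
    print(conversation_tokens(messages))
```

Once this estimate passes 4096 for the accumulated memory, the next request should trigger the error above.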

Screenshots: [two screenshots attached]

matthewsilver avatar Feb 11 '24 16:02 matthewsilver

Hey @matthewsilver, could you try setting it to the maximum, 80K?

[screenshot]

chungyau97 avatar Feb 12 '24 03:02 chungyau97

Hi @chungyau97, I set it to 80000, but I still get the 4096 token limit error. The error message changes slightly: it now says I requested 80000 tokens in the completion.

matthewsilver avatar Feb 12 '24 14:02 matthewsilver

Hi @chungyau97, do you have an update on this? Thanks!

matthewsilver avatar Feb 21 '24 19:02 matthewsilver

Strange. Can you try removing the node and re-adding it from scratch? gpt-4-32k should have a 32k context limit, not a 4k one.

HenryHengZJ avatar Feb 22 '24 15:02 HenryHengZJ