[BUG] Azure ChatOpenAI gpt-4-32k has 4096 context token limit
Describe the bug Trying to use Azure ChatOpenAI gpt-4-32k, but getting errors like "This model's maximum context length is 4096 tokens. However, your messages resulted in 4106 tokens (4054 in the messages, 52 in the functions). Please reduce the length of the messages or functions." This happens even after I manually increase the Max Tokens parameter in the chat model.
To Reproduce: See screenshots below. Continue a conversation until the memory exceeds 4096 tokens.
Hey @matthewsilver, could you try setting Max Tokens to the maximum, 80,000?
Hi @chungyau97, I set it to 80000 but it still gives me the 4096 token limit error. The error message does change a bit, it now says I requested 80000 tokens in the completion.
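For context, that second error is consistent with how OpenAI-style APIs budget tokens: the prompt tokens and the requested completion (`max_tokens`) must together fit inside the model's context window, so asking for an 80,000-token completion against a model the endpoint thinks has a 4,096-token window fails immediately. A minimal sketch of that budget arithmetic, using the numbers from the error message above (`fits_context` is a hypothetical helper for illustration, not part of any library):

```python
def fits_context(prompt_tokens: int, max_tokens: int, context_limit: int) -> bool:
    """The prompt and the requested completion share one context window."""
    return prompt_tokens + max_tokens <= context_limit

prompt = 4054 + 52  # message tokens + function tokens, per the original error

print(fits_context(prompt, 0, 4096))       # False: the prompt alone exceeds a 4k window
print(fits_context(prompt, 80000, 32768))  # False: an 80k completion can never fit in 32k
print(fits_context(prompt, 512, 32768))    # True: fine once the real 32k limit is applied
```

This suggests the endpoint is enforcing a 4k window regardless of the Max Tokens setting, i.e. the deployment is being treated as a 4k-context model rather than gpt-4-32k.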
Hi @chungyau97, do you have an update on this? Thanks!
Strange. Can you try deleting the node and re-adding it from scratch? gpt-4-32k should have a 32k context limit, not a 4k one.