"Input validation error" on https://huggingface.co/chat when using llama-3-70b
Input validation error: inputs tokens + max_new_tokens must be <= 8192. Given: 6204 inputs tokens and 2047 max_new_tokens
This error occurs again and again, even if I edit the message down to a very short one or refresh the page. My initial message was relatively long (source code) and failed with this error, but even after reducing it to a single short sentence I still get the exact same error, reporting 6204 input tokens and 2047 max_new_tokens.
It seems the conversation has gotten a bit long, but I wouldn't expect that to produce an error for every message I try to send? A rough sketch of what I think the check is doing is below.
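For reference, the check the backend appears to apply is simply input tokens + max_new_tokens ≤ model context length (8192 here). This is only a minimal sketch of that arithmetic, with illustrative names rather than the actual text-generation-inference code, and it assumes the 6204 input tokens count the whole conversation history, which would explain why editing one message doesn't change the number.

```python
# Minimal sketch of the validation the backend appears to apply.
# Names and structure are illustrative, not the actual implementation.
MAX_TOTAL_TOKENS = 8192  # context length for llama-3-70b on HuggingChat

def validate_request(input_tokens: int, max_new_tokens: int) -> None:
    """Reject requests whose prompt plus requested generation exceed the context window."""
    total = input_tokens + max_new_tokens
    if total > MAX_TOTAL_TOKENS:
        raise ValueError(
            f"Input validation error: inputs tokens + max_new_tokens must be "
            f"<= {MAX_TOTAL_TOKENS}. Given: {input_tokens} inputs tokens and "
            f"{max_new_tokens} max_new_tokens"
        )

# With the numbers from the error above: 6204 + 2047 = 8251 > 8192, so it fails.
validate_request(6204, 2047)
```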
This has actually been under discussion here for a while: https://huggingface.co/spaces/huggingchat/chat-ui/discussions/430